Magazine Contemporary Culture

Science

T. S. Eliot, Tradition and the Individual Talent

“Tradition and the Individual Talent” (1919) is an essay written by poet and literary critic T. S. Eliot. The essay was first published in The Egoist (1919) and later in Eliot’s first book of criticism, “The Sacred Wood” (1920). The essay is also available in Eliot’s “Selected Prose” and “Selected Essays”.

While Eliot is most often known for his poetry, he also contributed to the field of literary criticism. In this dual role, he acted as poet-critic, comparable to Sir Philip Sidney and Samuel Taylor Coleridge. “Tradition and the Individual Talent” is one of the better-known works that Eliot produced in his capacity as a critic. It formulates Eliot’s influential conception of the relationship between the poet and the literary tradition which precedes them.

The essay is divided into three parts:

Part One: The Concept of “Tradition”
Part Two: The Theory of Impersonal Poetry
Part Three: The Conclusion, or Summing Up

Eliot presents his conception of tradition and the definition of the poet and poetry in relation to it. He wishes to correct the fact that, as he perceives it, “in English writing we seldom speak of tradition, though we occasionally apply its name in deploring its absence.” Eliot posits that, though the English tradition generally upholds the belief that art progresses through change, that is, through a break with tradition, literary advances are in practice recognised only when they conform to the tradition. Eliot, a classicist, felt that the true incorporation of tradition into literature went unrecognised, and that tradition, a word that “seldom… appear[s] except in a phrase of censure,” was in fact a thus-far unrealised element of literary criticism.

For Eliot, the term “tradition” is imbued with a special and complex character. It represents a “simultaneous order,” by which Eliot means a historical timelessness – a fusion of past and present – and, at the same time, a sense of present temporality. A poet must embody “the whole of the literature of Europe from Homer,” while, simultaneously, expressing their contemporary environment. Eliot challenges the common perception that a poet’s greatness and individuality lie in their departure from their predecessors; he argues that “the most individual parts of his [the poet’s] work may be those in which the dead poets, his ancestors, assert their immortality most vigorously.” Eliot claims that this “historical sense” is not only a resemblance to traditional works but an awareness and understanding of their relation to his poetry.

T. S. Eliot, Tradition and the Individual Talent, 1919.

I

In English writing we seldom speak of tradition, though we occasionally apply its name in deploring its absence. We cannot refer to “the tradition” or to “a tradition”; at most, we employ the adjective in saying that the poetry of So-and-so is “traditional” or even “too traditional.” Seldom, perhaps, does the word appear except in a phrase of censure. If otherwise, it is vaguely approbative, with the implication, as to the work approved, of some pleasing archæological reconstruction. You can hardly make the word agreeable to English ears without this comfortable reference to the reassuring science of archæology.

Certainly the word is not likely to appear in our appreciations of living or dead writers. Every nation, every race, has not only its own creative, but its own critical turn of mind; and is even more oblivious of the shortcomings and limitations of its critical habits than of those of its creative genius. We know, or think we know, from the enormous mass of critical writing that has appeared in the French language the critical method or habit of the French; we only conclude (we are such unconscious people) that the French are “more critical” than we, and sometimes even plume ourselves a little with the fact, as if the French were the less spontaneous. Perhaps they are; but we might remind ourselves that criticism is as inevitable as breathing, and that we should be none the worse for articulating what passes in our minds when we read a book and feel an emotion about it, for criticizing our own minds in their work of criticism. One of the facts that might come to light in this process is our tendency to insist, when we praise a poet, upon those aspects of his work in which he least resembles anyone else. In these aspects or parts of his work we pretend to find what is individual, what is the peculiar essence of the man. We dwell with satisfaction upon the poet’s difference from his predecessors, especially his immediate predecessors; we endeavour to find something that can be isolated in order to be enjoyed. Whereas if we approach a poet without this prejudice we shall often find that not only the best, but the most individual parts of his work may be those in which the dead poets, his ancestors, assert their immortality most vigorously. And I do not mean the impressionable period of adolescence, but the period of full maturity.

Continue reading…

Source: National Academy of the Arts, Oslo.
Text: Wikipedia article, https://en.wikipedia.org/wiki/Tradition_and_the_Individual_Talent.
All images belong to the respective artists and their management.

Read Full Article

Post-Capitalism: The End of Capitalism Has Begun

Without us noticing, we are entering the postcapitalist era. At the heart of further change to come is information technology, new ways of working and the sharing economy. The old ways will take a long while to disappear, but it’s time to be utopian.

The red flags and marching songs of Syriza during the Greek crisis, plus the expectation that the banks would be nationalised, revived briefly a 20th-century dream: the forced destruction of the market from above. For much of the 20th century this was how the left conceived the first stage of an economy beyond capitalism. The force would be applied by the working class, either at the ballot box or on the barricades. The lever would be the state. The opportunity would come through frequent episodes of economic collapse.

Instead over the past 25 years it has been the left’s project that has collapsed. The market destroyed the plan; individualism replaced collectivism and solidarity; the hugely expanded workforce of the world looks like a “proletariat”, but no longer thinks or behaves as it once did.

If you lived through all this, and disliked capitalism, it was traumatic. But in the process technology has created a new route out, which the remnants of the old left – and all other forces influenced by it – have either to embrace or die. Capitalism, it turns out, will not be abolished by forced-march techniques. It will be abolished by creating something more dynamic that exists, at first, almost unseen within the old system, but which will break through, reshaping the economy around new values and behaviours. I call this postcapitalism.

As with the end of feudalism 500 years ago, capitalism’s replacement by postcapitalism will be accelerated by external shocks and shaped by the emergence of a new kind of human being. And it has started.

Postcapitalism is possible because of three major changes information technology has brought about in the past 25 years. First, it has reduced the need for work, blurred the edges between work and free time and loosened the relationship between work and wages. The coming wave of automation, currently stalled because our social infrastructure cannot bear the consequences, will hugely diminish the amount of work needed – not just to subsist but to provide a decent life for all.

Second, information is corroding the market’s ability to form prices correctly. That is because markets are based on scarcity while information is abundant. The system’s defence mechanism is to form monopolies – the giant tech companies – on a scale not seen in the past 200 years, yet they cannot last. By building business models and share valuations based on the capture and privatisation of all socially produced information, such firms are constructing a fragile corporate edifice at odds with the most basic need of humanity, which is to use ideas freely.

Third, we’re seeing the spontaneous rise of collaborative production: goods, services and organisations are appearing that no longer respond to the dictates of the market and the managerial hierarchy. The biggest information product in the world – Wikipedia – is made by volunteers for free, abolishing the encyclopedia business and depriving the advertising industry of an estimated $3bn a year in revenue.

Almost unnoticed, in the niches and hollows of the market system, whole swaths of economic life are beginning to move to a different rhythm. Parallel currencies, time banks, cooperatives and self-managed spaces have proliferated, barely noticed by the economics profession, and often as a direct result of the shattering of the old structures in the post-2008 crisis.

Read Full Article

Curiosity Rover: Finds Biologically Useful Nitrogen on Mars

View from the Mars Hand Lens Imager camera on the arm of NASA’s Curiosity Mars rover: Curiosity self-portrait at the Mojave site on Mount Sharp, Mars, 2015

A team using the Sample Analysis at Mars (SAM) instrument suite aboard NASA’s Curiosity rover has made the first detection of nitrogen on the surface of Mars, released during the heating of Martian sediments. The nitrogen was detected in the form of nitric oxide and could have been released by the breakdown of nitrates during heating. Nitrates are a class of molecules that contain nitrogen in a form that can be used by living organisms. The discovery adds to the evidence that ancient Mars was habitable for life.

Nitrogen is essential for all known forms of life, since it is used in the building blocks of larger molecules like DNA and RNA, which encode the genetic instructions for life, and proteins, which are used to build structures like hair and nails, and to speed up or regulate chemical reactions.

However, on Earth and Mars, atmospheric nitrogen is locked up as nitrogen gas (N2) – two atoms of nitrogen bound together so strongly that they do not react easily with other molecules. The nitrogen atoms have to be separated or “fixed” so they can participate in the chemical reactions needed for life. On Earth, certain organisms are capable of fixing atmospheric nitrogen and this process is critical for metabolic activity. However, smaller amounts of nitrogen are also fixed by energetic events like lightning strikes.
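For readers who want the chemistry spelled out, the “fixing” described above can be summarised in a single schematic reaction; this is an illustrative equation of the general process, not one quoted from the Curiosity team’s paper:

% Schematic thermal fixation: at the high temperatures of a lightning bolt or
% an impact shock, molecular nitrogen and oxygen combine to form nitric oxide,
% which can later be oxidised and deposited on the surface as nitrate (NO3-).
\[
  \mathrm{N_2} + \mathrm{O_2} \;\longrightarrow\; 2\,\mathrm{NO}
\]

Nitric oxide produced this way can be further oxidised and laid down as nitrate salts, the nitrogen-bearing form that, as described above, releases nitric oxide again when SAM heats the sediment samples.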

Features resembling dry riverbeds and the discovery of minerals that only form in the presence of liquid water suggest that Mars was more hospitable in the remote past. The Curiosity team has found evidence that other ingredients needed for life, such as liquid water and organic matter, were present on Mars at the Curiosity site in Gale Crater billions of years ago.

“Finding a biochemically accessible form of nitrogen is more support for the ancient Martian environment at Gale Crater being habitable,” said Jennifer Stern of NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Stern is lead author of a paper on this research published online in the Proceedings of the National Academy of Sciences on March 23.

“Scientists have long thought that nitrates would be produced on Mars from the energy released in meteorite impacts, and the amounts we found agree well with estimates from this process,” said Stern.

The team found evidence for nitrates in scooped samples of windblown sand and dust at the “Rocknest” site, and in samples drilled from mudstone at the “John Klein” and “Cumberland” drill sites in Yellowknife Bay. Since the Rocknest sample is a combination of dust blown in from distant regions on Mars and more locally sourced materials, the nitrates are likely to be widespread across Mars, according to Stern.

Read Full Article

The Dark Web: It Sounds Sinister and It Certainly Does Hide a Multitude of Very Dark Dealings

It’s a technological arms race, pure and simple.

That’s how Jamie Bartlett, author of The Dark Net, sums up the constantly evolving battle in cyberspace between terrorists and the intelligence agencies trying to discover their hidden communications.

“The unbelievable growth in widely available (encryption) software will make their job much harder,” he said. “What it will mean is a shift away from large-scale traffic network analysis to almost old-fashioned intelligence work to infiltrate groups – more people on the ground as opposed to someone on a computer in Cheltenham.”

In the Second World War there was Enigma, the German cipher machine whose messages were eventually decrypted by Britain. There was also steganography, the art of shrinking and concealing information inside objects such as microdots, usually only detectable by those who knew exactly where to look.

In Cold War days spies sat next to each other on park benches or left secret messages to be picked up later in “dead letter drops” behind objects such as flowerpots or in crevices in walls.

In the 1990s extremist groups used satellite phones and faxes to communicate, with paper messages from Osama Bin Laden in Afghanistan churning out of a fax machine operated in North London by his UK representative. Already that sounds almost prehistoric.

For close to two decades now the internet has been the river through which most terrorist communications flow, hiding amongst the legitimate, the ordinary, the innocuous or the just conventionally criminal.

The dark web sounds sinister and it certainly does hide a multitude of very dark dealings. But the sheer number of ordinary people now using it as a matter of course has inevitably pulled it closer into the mainstream of digital communications.

The more people who use it, the easier, in theory, it will be for terrorists to hide their own messages amongst its terabytes of data. But the dark web does have benign uses and while it presents a growing challenge to counter-terrorism authorities this is a phenomenon that is unlikely to disappear anytime soon.

http://www.bbc.com/news/technology-31948818

Read Full Article

Discourse: A Hypersexual Society

Kenneth C.W. Kammeyer, A Hypersexual Society: Sexual Discourse, Erotica, and Pornography in America Today. Palgrave Macmillan, 2008

America today is a hypersexual society. Sexual discourse, erotica, and pornography are pervasive in the culture. Sexual materials, many times extending into erotica and pornography, are found in the consumer world, academia, sex therapy, the publishing world, mass media (especially radio, television and movies) and the Internet. The sexual materials found in all these areas of American society provoke relentless opposition by groups and individuals who want to repress or censor sexual materials. The combined effects of those who promote and produce sexual materials, and those who try to suppress them, add up to a cacophony of sexual discourse.

Hypersexuality is a clinical diagnosis used by mental healthcare researchers and providers to describe extremely frequent or suddenly increased sexual urges or sexual activity.

The Merriam-Webster Dictionary defines hypersexual as “exhibiting unusual or excessive concern with or indulgence in sexual activity.” Sexologists have been using the term hypersexuality since the late 1800s, when Krafft-Ebing described several cases of extreme sexual behaviours in his seminal 1886 book, Psychopathia Sexualis. The author used the term “hypersexuality” to describe conditions that would now be termed premature ejaculation.

Hypersexuality may be a primary condition, or the symptom of another medical disease or condition, for example Klüver-Bucy syndrome or bipolar disorder. Hypersexuality may also present as a side effect of medication such as drugs used to treat Parkinson’s disease. Clinicians have yet to reach a consensus over how best to describe hypersexuality as a primary condition, or to determine the appropriateness of describing such behaviors and impulses as a separate pathology.

Some authors have questioned whether it makes sense to discuss hypersexuality at all, arguing that labeling sexual urges “extreme” merely stigmatizes people who do not conform to the norms of their culture or peer group.

Hypersexual behaviours are viewed variously by clinicians and therapists as: an addiction; a type of obsessive-compulsive disorder (OCD) or “OCD-spectrum disorder”; or a disorder of impulsivity. A number of authors do not acknowledge such a pathology and instead assert that the condition merely reflects a cultural dislike of exceptional sexual behavior.

Consistent with there not being any consensus over what causes hypersexuality, authors have used many different labels to refer to it, sometimes interchangeably, but often depending on which theory they favor or which specific behavior they were studying. Contemporary names include compulsive masturbation, compulsive sexual behavior, cybersex addiction, erotomania, “excessive sexual drive”, hyperphilia, hypersexuality, hypersexual disorder, problematic hypersexuality, sexual addiction, sexual compulsivity, sexual dependency, sexual impulsivity, “out of control sexual behavior”, and paraphilia-related disorder.

https://books.google.com/books?id=mQjIAAAAQBAJ&pg=PA12&lpg=PA12&dq=a+hypersexual+society&source=bl&ots=KYZ7lm9g-2&sig=E2q5FvKU6_9OPgXulGtgSclkgaA&hl=no&sa=X&ei=PdEeVejGE8GvsQHtjYLACg&ved=0CEcQ6AEwBg#v=onepage&q=a%20hypersexual%20society&f=false

Read Full Article

AIDS research: Cured of HIV?

Boxes of antiretroviral medicines

From the printed edition of The Economist, March 19, 2013

In journalism, cynics suggest, three data points are enough for a trend. As of March 4th, AIDS researchers hope two might be sufficient. On that day Deborah Persaud of Johns Hopkins University announced to the Conference on Retroviruses and Opportunistic Infections, in Atlanta, Georgia, that a patient under her care had been cured of HIV infection. The announcement was hedged with caveats (“functionally cured” was the exact term used). But the bottom line was clear. Dr Persaud thinks her patient, a two-and-a-half-year-old girl, has joined Timothy Brown, a man known to many as the “Berlin patient”, as a human who was once infected with HIV and now no longer is.

The girl was born infected because her mother was infected but was not under treatment at the time (which would normally prevent mother-to-child transmission). She was given standard anti-retroviral drugs almost immediately and for 18 months afterwards. Doctors then lost track of her for five months and when she returned to their attention, they found the virus had vanished. Half a year later, despite the fact that she is no longer taking anti-AIDS medicine, there is no sign of HIV having returned.

This is a result of great potential significance. Mr Brown’s cure was effected because his bone marrow (and thus the pertinent part of his immune system, which HIV infects) was destroyed and replaced during a course of treatment for leukaemia. That is hardly a viable approach for most people. But if HIV infection can be cured with drugs, as Dr Persaud’s observations suggest, a whole new line of investigation opens up.

http://www.economist.com/news/science-and-technology/21573087-american-child-seems-have-been-cured-hiv

Read Full Article

Online Resource: http://www.ubu.com/

Bruce Nauman, Good Boy Bad Boy, 1985, and Video Against AIDS, curated by John Greyson and Bill Horrigan, produced by Kate Horsfield, 1989

UbuWeb is an independent online educational resource for avant-garde material. UbuWeb does not distribute commercially viable works but rather resurrects avant-garde sound art, video and textual works through their translation into a digital art web environment, re-contextualising them with current academic commentary and contemporary practice.

All materials on UbuWeb are being made available for noncommercial and educational use and the service is completely free.

http://www.ubu.com/

Read Full Article

Brave new world, Mars, 6 August 2012

Curiosity’s first color image of the Martian landscape, August 6, 2012

Mars Science Laboratory (MSL) is a robotic space probe mission to Mars launched by NASA on November 26, 2011, which successfully landed Curiosity, a Mars rover, in Gale Crater on August 6, 2012. The overall objectives include investigating Mars’ habitability, studying its climate and geology, and collecting data for a manned mission to Mars. The rover carries a variety of scientific instruments designed by an international team.

MSL successfully carried out a more accurate landing than previous spacecraft to Mars, aiming for a small target landing ellipse of only 7 by 20 km, in the Aeolis Palus region of Gale Crater. This location is near the mountain Aeolis Mons (a.k.a. Mount Sharp). The rover mission is set to explore for at least 687 Earth days (1 Martian year) over a range of 5 by 20 km.

The Mars Science Laboratory mission is part of NASA’s Mars Exploration Program, a long-term effort for the robotic exploration of Mars that is managed by the Jet Propulsion Laboratory of California Institute of Technology. The total cost of the MSL project is about US$2.5 billion.

Previous successful U.S. Mars rovers include Spirit and Opportunity, and Sojourner from the Mars Pathfinder mission. Curiosity is about twice as long and five times as heavy as Spirit and Opportunity, and carries over ten times the mass of scientific instruments.

http://en.wikipedia.org/wiki/Mars_Science_Laboratory#EDL_event.E2.80.93August_6.2C_2012

Read Full Article

Escaping Earth: Kepler-22b, The first planet found in the “habitable zone”

NASA’s Kepler Mission Announces Latest Planetary Discovery

The Kepler mission’s science team announced its latest finding at a press conference on Monday, Dec. 5, 2011. The team announced the confirmation of Kepler-22b, its first planet found in the “habitable zone,” the region where liquid water could exist on a planet’s surface. The planet is about 2.4 times the radius of Earth, orbits around a star similar to our sun and is located 600 light-years away.

Scientists don’t yet know if Kepler-22b has a predominantly rocky, gaseous or liquid composition, but its discovery is a step closer to finding Earth-like planets. The planet’s host star belongs to the same class as our sun, called G-type, although it is slightly smaller and cooler.

http://www.nasa.gov/kepler

Read Full Article

“Equally significant is the decline of the critic as a culture broker. Today that role is played by the curator. That, too, will change.”

In October 2005 Frieze asked 33 artists, collectors, critics, curators, educators and gallerists: How has art changed?
With the proliferation of museums, biennales and fairs, and the sheer amount of work now being made, shown, and sold, the art world has obviously changed substantially over the last 40 or so years. But what have been the most important shifts in art and the structures that surround it?

Iwona Blazwick
A critic, art historian, lecturer, broadcaster and Director of the Whitechapel Art Gallery, London

The last four decades are both a legacy of the 1960s and a betrayal of their revolutionary potential. We can thank the Feminist and Civil Rights movements for making our art world massively more inclusive. The dematerialization of the object of art and its expansion into idea or phenomenon have made it possible for a text, an action or an environment to be understood as art and for Modernist realism to continue by other means. Early experiments with Super 8 and video laid the groundwork for the colonization of the art world’s time and space by the moving image, while the lens has attained equal status with the paintbrush. The entry of Structuralism, psychoanalysis and anthropology into theories of art has rocked the boat of aesthetics and evolved into an insistence on subjectivity and participation as integral to meaning. The spirit of collaboration and the co-option of empty property that was a hallmark of so many artists’ groups in the 1960s continues to live on generating a mobile but sustained network of laboratories for art. It’s an expanding field that has also become increasingly professionalized, commercialized and spectacularized. The last 40 years have marked the rise and proliferation of curators, collectors and architects specializing in making museums into powerful corporate brands that are intended to provide mass entertainment, generate tourism or solve social problems. Art has moved from margin to centre, with all the losses and gains that this entails.

Teresa Gleadowe
Director of the MA Curating Contemporary Art at the Royal College of Art, London.

In the summer of 1966 the Arts Council presented at the Tate Gallery ‘The Almost Complete Works of Marcel Duchamp’, the first major retrospective of the artist’s work to be held in Europe. It was selected by Richard Hamilton, who wrote the one-page introduction to a catalogue designed in classic Modernist style by Gordon House. With its Monotype Univers text, monochrome illustrations shown approximately to scale, modest selection of six colour plates and discreet scholarly tone, this understated but informative publication speaks eloquently of a particular set of assumptions about the purpose of art. There is no expectation of box office or of universal appeal. Forty years on the continuing influence of Duchamp on art theory and practice is still felt, but the world of art is no longer a separate sphere. It is now permeated by the art market, by business and political interests, and by the values of the ‘creative industries’. The museum building boom of the late 1980s continues, linked to agendas of economic regeneration. Biennials have proliferated, and supranational museum brands have been established. The audience for art has exploded in size, with a growing emphasis on diversity and education. Electronic communication has increased the speed at which an exhibition or publication can be realized, and international travel has become an essential component of the activity of the artist or curator. Art has become a globalized field, no longer bounded by the physical presence of the work of art.

Dave Hickey
An art critic who lives in Las Vegas, Nevada.

First, I am not a part of this any more. I am still writing but obsolete, a dead man walking. Most of my younger colleagues, the art critics who should be in the ascendant today, died of AIDS in the 1980s. Those that survive are academics or tabloid celebrity geeks. Art dealers who once represented an informed aesthetic now show one of each. Museum directors and curators who once proudly promoted an engaged view of things now show one of each. Magazines that once implemented an informed agenda, now publish one of each, pro and con. With the exception of collectors, everyone in the art world today is either a public servant or journalist, a poll watcher or a bean counter, implementing ‘fairness’. With the defection of critics, curators, museum directors and editors from the realm of informed decision-making, only collectors vote on new art, so they drive the market, which, as a consequence, is radically front-loaded, frivolously quixotic and egregiously sentimental. Other than that, everything is peachy.

Matthew Higgs
Director and Chief Curator of White Columns, New York.

My feeling is that only a nostalgic (or curmudgeon) would argue that things were better (or, even, more interesting) in the 1960s, 1970s, 1980s or 1990s. ‘Right now’, i.e., the present tense, is always the best time for what we do. Certainly there have never been more people interested in and involved with contemporary art, which is a good thing. (‘Art for all’, as Gilbert & George might say.) I know that some people grumble about art fairs and biennials, but these are probably the same kind of people who would grumble about their favourite bands becoming popular. Technology insists that things happen and are absorbed a lot quicker these days, but that’s OK too. (Only romantics would have it another way, and don’t forget: Ars longa vita brevis.) What’s sad? Call me old-fashioned, but art seems to be losing its regional dialects and accents, becoming instead a kind of visual Esperanto, but, hey, you can’t have everything. What’s bad? The art world remains too professional and too bourgeois … some things, I guess, will never change.

Olu Oguibe
An artist who has exhibited his work in biennials and triennials around the world, and also curated major exhibitions for numerous spaces, including Tate Modern and the Venice Biennale.

Over the last 40 years contemporary art has witnessed few significant changes besides the numerous trends and fads that provide the art world with much-coveted entertainment. Photography gained prominence. Video art failed to fulfil its early promise. The much-touted dematerialization of the art object proved to be a mere discursive fantasy, as even ‘new media’ artists continue to privilege the marketable object. Eventually cynicism has replaced genuine curiosity and engagement even in post-colonial contemporary art. More significant, however, is the consolidation of women artists’ place in discourse, display, documentation and practice. Despite continued gender disparity in visibility and remuneration, women artists have registered their presence beyond contest or erasure. This is particularly important because younger artists can now take the possibility of success and recognition for granted. Equally significant is the decline of the critic as a culture broker. Of course, critics remain important arbiters of taste, but the all-powerful, fate-determining critic that emerged especially in mid-century America and brokered careers, movements and canonical paradigms, leaving powerful imprints on the discourse of contemporary art, is no more. Today that role is played by the curator. That, too, will change. Regrettably, artistic autonomy has also declined. In the 1960s and early 1970s artists boldly and consciously distanced themselves from the establishment, and in the process opened refreshing avenues for expression. However, that independence is all but completely ceded today as artists jostle for position and jockey to mortgage their work, careers and convictions for success, thus relegating themselves to pawns in the culture game.

http://www.frieze.com/issue/article/how_has_art_changed/

Read Full Article

Talks: The Trouble with Productivity, ICA

The Trouble with Productivity, 11 January 2012
Institute of Contemporary Arts

Artists, writers and curators today, more than ever, take part in a time-pressured culture of high performance.

Can you be productive by not being productive? Are there artistic possibilities in exhaustion, failure and laziness? Those were among the questions posed at the ICA last week during a discussion between the writer and critic Laura McLean-Ferris, the curator Paul Pieroni, and the writer and philosopher Lars Iyer (all of whom are unusually productive, we audience members couldn’t help noticing).

McLean-Ferris opened with reference to a number of articles written a few years ago when the art market was at an “overblown” level: Dan Fox observed in Frieze Magazine (of which he is associate editor) that the art world had developed into “a high-turnover, high-visibility international activity that everyone wants a slice of”, and expressed an uneasiness about galleries’ sleek corporate architecture and vague but authoritative-sounding art-speak, which he suspected to be somewhat at odds with the ways in which artists actually work. In a piece entitled “I Can, I Can’t, Who Cares”, the critic and curator Jan Verwoert, troubled by the relentless pressure on creative types to “perform”, looked around for interesting and amusing examples of “unwillingness, non-compliance, uncooperativeness, reluctance or non-alignment”.

One example is the work of the Croatian artist Mladen Stilinović, whose photographic series “The Artist at Work” (1978) shows him lying in bed. Stilinović is the author of a manifesto, “In Praise of Laziness” (1993), in which he describes laziness as “the absence of movement and thought, dumb time – total amnesia. It is also indifference, staring at nothing, non-activity, impotence. It is sheer stupidity, a time of pain, futile concentration”. Those “virtues of laziness” are important factors in art, he says: “Knowing about laziness is not enough; it must be practised and perfected”.

McLean-Ferris pointed out that Herman Melville’s Bartleby the Scrivener – who discombobulates his Wall Street colleagues with the refrain “I would prefer not to” – has become a sort of model for non-compliance among artists. Étienne Chambaud produced a neon sign that reads “I would prefer not to too” (2007), which never gets switched on. Pilvi Takala spent a month in the marketing department of the accountancy firm Deloitte, where she filmed herself doing nothing all day for her installation The Trainee (2008). Only a few people knew that she was not an ordinary worker. As it says on Takala’s website: “these acts, or rather the absence of visible action, slowly make the atmosphere around the trainee unbearable . . . . What provokes people in non-doing, alongside strangeness, is the element of resistance. The non-doing person isn’t committed to any activity, so they have the potential for anything”.

Paul Pieroni was keen in his talk to “recalibrate ideas about how we might understand procrastination”. While acknowledging that it can be straightforwardly evasive – he confessed that when he was supposed to be preparing for his ICA appearance he went out and bought a carbon monoxide detector, having created “a ridiculous context where I might die” – he said that procrastinatory “counter-activity” can be important in art production, “especially when we’re not sure what the priorities of our actions are”.

Counter-activity can have value even if the original goal is discarded entirely, he said: his friend Mike Harte, who at one time aspired to be an artist but always seemed to find something else to do (sitting about, karaoke, eating Choc Dips – we were treated to a little slide show), used to write letters full of trivia, newspaper cuttings and “general nonsense” to his friend Jamie Shovlin, who was then studying at the Royal College of Art. Shovlin kept the letters and made them into an art project of his own, Mike Harte – Make Art, which was commissioned by the collective art collection V22. Pieroni has lately been researching what he calls the effluvia of office work – which include humorous pictures and signs such as the ones on this page (other slogans include “A tidy desk is a sign of a sick mind”) – for the exhibition Xeroxlore, which opens tomorrow, January 20, at SPACE.

Lars Iyer’s subject was the common view of consumers versus producers: consumers are suggestible and easily manipulable, while producers are in control, in charge. He finds the modern notion of the producer problematic: in an age of neoliberal capitalism, he said, to be a producer is to be a capitalist entrepreneur; and very often what producers are marketing – through social media, for example – are not artworks, but themselves. The ICA audience, a large, mostly young crowd (Mike Harte was there – I spotted him in the bar afterwards), asked some interesting questions in relation to this; McLean-Ferris answered one about productivity and internet use with the comment that people who upload a lot of material tend to be particular kinds of people, who can set the agenda in ways that may not be obvious.

Iyer suggested in a recent article in the White Review that great literature was over, and that modern writers were sullied by their active involvement with the marketplace. At the ICA he told us that if writing is to be successful today, it should respond “to the ways in which we arse about when no one’s watching us”. Iyer’s compulsively readable, funny and touching novel Spurious (2011) started as an attempt, in the form of a blog, to answer a serious philosophical question. “What I found myself doing instead”, he told us, “was recording the stupid conversations I’d had with a friend of mine”.

Read Full Article

Narcissism is back in fashion

Who, Me?
Pretty pretty good

It may come as a surprise, but narcissism was facing extinction by 2013. That’s when the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders – also known as the DSM–V – was due to appear without its traditional entry on the Narcissistic Personality Disorder (NPD). Published by the American Psychiatric Association, the manual is the reference for American practitioners (the rest of the world relies on the World Health Organization’s International Classification of Diseases). In the current DSM–IV (1994), NPD is one of ten personality disorders. But in the forthcoming edition, these were to be reduced to only five basic types: antisocial, avoidant, borderline, obsessive–compulsive and schizotypal.

In The New York Times, the Harvard psychiatrist Dr. John Gunderson called the change ‘draconian’. It seems to be part of the American trend to make mental disorders biological (and thus treatable with drugs) instead of relying on old psychoanalytic terms and the traditional talking cure, which even Freud once described as ‘interminable’. Last summer, the American Psychiatric Association must have buckled under the pressure; according to its website, narcissism has been saved from oblivion and is back in the mix of personality disorders for the forthcoming DSM–V.

For narcissists, it’s a victory they may be hard-pressed to acknowledge (‘Me? A narcissist?!’). I find the volte-face dismaying, not because I’m for prescribing drugs and against talking cures. You don’t need to be a psychiatrist or a psychoanalyst to see that narcissism has shifted from a pathological condition to a norm, if not a means of survival. Melancholy also shifted historically from a medical illness to a state of mind. But narcissism appears as a necessity in our society of the spectacle, which runs from Andy Warhol’s ‘15 minutes of fame’ prediction through reality TV and self-promotion to today’s YouTube hits. While the media and social media played a role in normalizing narcissism, photography has played along with them. We exist in and for society, only once we have been photographed. The photographic portrait is no longer linked to milestones like graduation ceremonies and weddings, or exceptional moments such as vacations, parties or even crimes. Photography has become part of a daily, if not minute-by-minute, staging of the self. Portraits appear to have been eclipsed by self-portraits: tweeted, posted, shared.

A brief historical review of attitudes towards narcissism and self-portraiture may be in order. According to Greek mythology, Narcissus was the man who fell in love with his reflection in a pool of water. According to the DSM–IV, 50–70 percent of those diagnosed with NPD are men. But according to my upbringing in Canada, looking at one’s reflection in a mirror for too long was a weakness particular to the fairer sex and an anti-social taboo. I recall doubting Cindy Sherman’s ‘Untitled Film Stills’ (1977–80): wasn’t she just a narcissist taking pictures of herself all day long? At least she was modest enough to use a remote shutter trigger. By contrast, Helmut Newton had openly displayed his camera when he captured his reflection in a bathroom mirror for Self-portrait, Lenox Hill Hospital, New York City (1973). Then again, the exceptional situation – his naked body hooked up to wires for an electro-cardiogram – made the self-portrait look like a moment of documentation.

There’s no modesty about staging the self in Patty Chang’s Fountain (1999), although this work is a video and not a photographic self-portrait. A modern-day Narcissus, Chang gazes into a mirror covered by a thin layer of water and can’t stop licking it all up. Digital narcissism has recently gained attention in the popular press with Gabriela Herman’s portrait series ‘Bloggers’ (2010–11), which captures bloggers gazing into their glowing screens alone at night. But closer to our narcissistic norm are Wolfram Hahn’s portraits of people taking pictures of themselves (‘Into the Light’, 2009–10) or Joan Fontcuberta’s book A través del espejo (Through the Looking Glass, 2010), a collection of online images of self-portraitists posing with their cameras.

Unabashed self-portraiture has a purely formal side, which seems to have escaped recent discussions about photography. Today, no one bothers to use the remote shutter trigger or even the camera’s timer to make a self-portrait. We contemporary narcissists – me, myself and I – simply hold the camera or the phone in front of our faces and push the button. But this approach has led to a profound shift in the vanishing point, which has historically been understood as a point disappearing on the horizon in a landscape, whether drawn, painted or photographed. What disappears today is the hand of the photographer, holding the camera and aiming it at himself. While the hand lies outside the frame, the outstretched arm seems to vanish into the foreground. The vanishing point is not outside in the world and off in the distance, but on our own bodies. If we once directed our gaze outwards, we now look inwards and invite the world to watch as we lose ourselves.

Published in Frieze d/e Issue 143, November-December 2011. By Jennifer Allen

Read Full Article