
“coolness as a form of economic labour”

13-Nov-25

Another recent article in The Guardian, Can I learn to be cool – even though I am garrulous, swotty and wear no-show socks?, tries to define “effortlessly, undeniably cool”. It pretty much articulates the notion that cool equals the physical appearance of the popular and famous. Consumer fashion. The journalist, Elle Hunt, seeks reassurance that she is cool – or at least that she could be considered cool if she dressed the right way.

Once more, the starting point is Pezzuti, Warren and Chen’s paper in the Journal of Experimental Psychology, which argues that there has been a universal crystallization of “the meaning of cool” and reduces it to six attributes: extroversion, openness, hedonism, adventurousness, autonomy and power.

Pezzuti’s interest is in coolness as a form of economic labour or production. His hypothesis is that – just as tribal societies prize skilled hunters, who provide food for the group – today’s information economy turns on creativity and innovation. “Cool” expresses the status and reward bestowed upon individuals who push boundaries, generate new ideas and promote their spread, to collective benefit.

While I don’t doubt that one of capitalism’s (or the dominant culture’s, if you like) major achievements is assimilating and repurposing oppositional aspects – counter-cultures become fuel for the dominant culture – the “collective benefit” Pezzuti recognises is that of making money. What’s most telling is that, when asked, Pezzuti cites Richard Branson as someone he considers cool (and, more tellingly, identifies how this has contributed to making him a billionaire). For Pezzuti, you’re cool if you’re popular and rich. He insists that cool can’t be alienating (unlike historical cool).

Joel Dinerstein, professor of English at Tulane University and author of The Origins of Cool in Postwar America, does a little better when he is quoted in the article as saying:

“It’s a combination of rebellion, personal style, otherworldly confidence and charisma … It’s actually a very mysterious calculus … a person who’s cool does not give a shit about what you think about them”

And, yet, Dinerstein follows Pezzuti by insisting that coolness is a commodity: “And in today’s consumer economy, to be cool, you must also be marketable.”

Dinerstein tries to separate current cool from what was previously cool, arguing that whereas selling out was once considered uncool, it’s now the reverse. “Once you are fully immersed in a consumer society, that outlaw sensibility can’t come from anywhere,” says Dinerstein, who opines that “at this point, cool is correlated with celebrity” and – based on conversations with his students – that only the famous can be cool. Somewhere in his argument is something vague about “perceived authenticity”.

Later in the article, the journalist visits Cora Delaney, founder and director of creative agency EYC Ltd, who tells her: “All the coolest people I know are hustlers. If you’ve just had it given to you, then it’s not that cool.”

Cool, to her, is about being an individual and “doing your own thing”.

It’s the conflation of celebrity, market capitalism and vague pop psychology that marks these sorts of discussions about what’s cool. What’s missing from them is the matter of taste. You have to have informed taste – which is pretty much always set against mainstream culture (though it might incorporate and subvert aspects of the mainstream). People selling you stuff and telling you they are cool… are not cool. Neither are people desperate to be cool.

“the ding an sich of cool”

09-Jul-25

An article in The Guardian about the definition of cool reports on findings published by the American Psychological Association that there are six characteristics people perceive in cool people:

extroverted, hedonistic, powerful, adventurous, open and autonomous

The article goes on to briefly reference the history of cool (and, to be honest, I’d never thought about where the idea of cool came from), such as the “aristocratic cool” of Renaissance Europeans:

Some suggest that sprezzatura, an Italian word first used by Baldassare Castiglione in 1528 and defined as “a certain nonchalance, to conceal all art and make what one does or says appear to be without effort”, captures the earliest essence of what cool is.

And:

Cool as a characterisation originated from 1940s jazz culture, when the black musician Lester Young challenged racial norms by refusing to smile when performing. He also used fashion as a marker of defiance, wearing sunglasses indoors on stage. Not long after he coined the slang term “that’s cool”, his fans began to use it when referencing him.

(Here I ask whether “fashion” is actually an aspect of someone being cool; see below on consumerist cool.) Professor Joel Dinerstein of Tulane University, Louisiana, teaches a course called The History of Cool and gives these as the qualities of cool:

“rebellious” and “charismatic”, flagging that another key quality is self-authorisation.

Chris Black – of the How Long Gone podcast – gives these definitions:

someone “being comfortable with who they are and what they say” is his “real baseline for coolness”

And:

“being very, very good at what they do”, saying it “shows a level of dedication and self-respect that I think is deeply cool”

(Which seems to me utterly subjective.) The writer of The Guardian article suggests that there is a wider societal mode of coolness:

Nowadays, social media means being cool is often less about a person and more about an aesthetic that can be carefully curated. Unlike a person, however, as soon as an item becomes mainstream, it is generally no longer deemed cool.

For me, Lester Young’s usage is closest to what I think is cool. It’s an authentic, autonomous expression of composure (or reserve) – a somewhat ironic detachment which distances itself from authority rather than directly confronting it. It also involves a state of calm which deliberately avoids conflict, as Shakespeare has Queen Gertrude tell Hamlet: “O gentle son, upon the heat and flame of thy distemper, sprinkle cool patience”.

There’s a 1997 Malcolm Gladwell article in the New Yorker in which Gladwell observes the “trickle up” of fashion trends from the street and posits:

the first rule of the cool: The quicker the chase, the quicker the flight. The act of discovering what’s cool is what causes cool to move on

And:

Cool is a set of dialects, not a language.

And:

the essence of the third rule of cool: you have to be one to know one

And:

the second rule says that cool cannot be manufactured, only observed, and the third says that it can only be observed by those who are themselves cool. And, of course, the first rule says that it cannot accurately be observed at all, because the act of discovering cool causes cool to take flight, so if you add all three together they describe a closed loop, the hermeneutic circle of coolhunting, a phenomenon whereby not only can the uncool not see cool but cool cannot even be adequately described to them.

The trouble with Gladwell’s account and his rules is that they refer to a consumerist cool – which definitely isn’t cool. It’s not authentic and lacks naturalness; it’s a pose struck for other people to look at and acknowledge, the cool they themselves want to foster. A consumerist cool has to gaze upon its own sense of cool. A truly cool person wouldn’t give a shit about what anyone else thought. They just are. Ding an sich.

“instinct to conform”

17-Feb-25

Julian Simpson’s Substack post, On Authenticity, has struck a chord with me this morning. I enjoy Simpson’s blogging/newsletters, in which he describes his insights as a scriptwriter, and I’m an avid fan of his audio dramas. Simpson frequently describes the challenges he faces in the media industry in taking creative ideas forward into production through the layers of bureaucracy and creative interference.

In On Authenticity, Simpson recounts his struggle with a screenplay that isn’t working for him; he begins to realise that the problem is that he has written against his instincts and has

attempted the square-peg-round-hole trick; in an attempt to take a weird idea and make it palatable, I have tried to package the weird idea as something un-weird

This causes Simpson to reflect on the nature of authenticity (as a writer).

Authenticity is telling your story the way you want to tell it, the way you would enjoy seeing it unfold. Anticipating what someone else might like is the opposite thing. But being authentic always feels courageous, and there are times for all of us where courage deserts us. Or at least times, like this one, where we realise we got scared somewhere along the way and have been unconsciously playing it safe, to the detriment of the story.

He goes on to reflect that:

It’s really easy to be authentic when it doesn’t matter; when there aren’t bills to pay, when it doesn’t matter if this thing ever gets made or published. It’s much harder when you feel external pressure and your instinct is to conform to a perceived expectation, to not rock the boat.

It seems to me that Simpson articulates something wider than a struggle to negotiate authentic creativity within a media production process. It’s how anyone can maintain a sense of their own authenticity (your notion of self; “your story the way you want to tell it”) when faced with conforming to life (job, career, relationship, “when there are bills to pay”, whatever).

For me, this notion of authenticity – which is simultaneously one of individual agency – is at the heart of how to live meaningfully. The “instinct to conform” seems to me to be totalising and oppressive in a way that we’ve not seen for nearly a century. (Though I’m willing to accept that gathering inauthenticity happens with age.) Is there a sliding scale of authenticity that meets a happy compromise somewhere along the line? Zero-hours contracts – anathema to authenticity, for who could ever live an authentic life tied to truly breadline-insecure employment? – would occupy one end of that scale. Or, even more so, the penury of long-term unemployment. But what would the other end – sincere authenticity – be? I’m not convinced that’s the right way of thinking about it, or that describing it as “courageous” is necessarily right.

There’s also an element of thinking that true authenticity can only be a sideline hustle, or something attained by the wealthy. That most of us who have to scrape out a living in an inauthentic, capitalist economy are forced to be perpetually fragmentary, inauthentic beings at the whims of our bosses and corporate overlords. (Maybe there’s truth in this, though, and it contributes to the pandemic of mental-health crises ravaging western capitalism.) That we can only attain brief moments of being authentic, and must not rock the boat too much in our lives. That we can only be our authentic selves as children and, in growing up, allow ourselves to become synthetic, phony. (Cue thoughts about PKD regarding most people as androids.)

Inauthentic android? Or is authenticity just a matter of Do what thou wilt?

“brutes abstract not, yet are not bare machines”

13-Feb-25

Continuing to think out loud.

I’m still at the stage of clarifying (for myself) precisely what a state of attention is. It seems to be a relatively modern, pragmatic concept which overlaps with ideas about mindfulness (often within a meditative religious framework), epistemology and cognitive function – all of which involve mental focus in controlling and directing the mind. Attention, it seems, directs consciousness (or is it that consciousness directs attention?) and, philosophically at least, plays a role in the development of a personal identity.

Importantly, most definitions of “attention” (especially if viewed as an active process in identity) require an element of active, reflective focus and concentration: we need to be aware of what we are doing in order to maintain our attention. (And attention requires focus on one thing at a time – a point made by Herbert A. Simon when he described humans as operating as serial devices.) That self-awareness through reflective activity is therefore a crucial foundation in the development of personal identity. Then there’s the issue of agency or, more simply, freedom. Is the current age of distraction and inattention (or perhaps auto-directed attention) doing much more than wasting people’s time? Is it actually damaging their sense of self and a coherent narrative of their own lives?

So, I guess we’re back to Locke.

Locke’s concept of reflection, presented in An Essay Concerning Human Understanding, establishes the view that the inward mind in activity is a source of knowledge. It is:

The perception of the operations of our own mind within us, as it is employed about the ideas it has got.

Reflection, for Locke, requires a second-order self-awareness when we perceive something (an act of recognition requires some degree of attention).

Locke’s definition of personal identity:

Personal identity consists not in the identity of substance, but in the identity of consciousness.

is based on self-awareness and (reflective) consciousness and directly involves memory (linking the awareness of what is apprehended in the present to past experiences and existing knowledge).

A great deal of the second half of Book 2 is given over to Locke’s thoughts about the Will and its relationship with freedom/liberty:

What is it determines the will? the true and proper answer is, The mind. For that which determines the general power of directing, to this or that particular direction, is nothing but the agent itself exercising the power it has that particular way.

(Will, Locke insists, is brought into being in the mind by what he calls “the Uneasiness of Desire”. The removal of “Uneasiness” brings happiness – and the pursuit of true happiness is the foundation of Liberty.)

So where am I going with this?

I wonder to what extent this sort of thinking about personal identity influences current concerns about distraction, attention and digital tech. If a reflective, self-directed, self-aware attention is persistently being hijacked by the distractions of things like new (social) media and smartphones, is it eroding or fragmenting a coherent sense of personal identity? Looking at the state of politics, and the obvious role of new technologies in shaping and influencing opinions on a mass, almost global scale, is a (Lockean) sense of a coherent personal identity at risk?

Considering doomscrolling, this point in the Essay seems apt:

if a man sitting still has not a power to remove himself, he is not at liberty

“to buffer it from the overrich environment in which it swims”

09-Feb-25

The current debate about the effects of digital technology – and social media in particular – often centres on the detrimental effects it has on our attention, both in terms of our ability to concentrate on tasks in a sustained way and in terms of the way our attention is grabbed by sensationalism and misinformation. It’s hard not to get caught up in this latest moral panic and become convinced that technology is supremely damaging and dangerous (it has been compared with hard drugs and tobacco, and even credited with altering the nature of capitalism itself!).

Certainly, it’s hard not to feel overwhelmed by a constant information deluge and a sense that one’s individual agency is increasingly being snatched away. There’s always too much to read and watch and look at and listen to and think about and reply to… The infamous “algorithms” that are frequently blamed for all the ills of social media and digital consumption should really be mechanisms for curating and buffering the information overwhelm. But, as we know, in practice they act as mechanisms for simply enabling more of it.

It’s this idea of a “buffer” or “curation” of information that is central to the polymath political scientist Herbert A. Simon’s 1971 paper, Designing Organizations for an Information-Rich World. Thirty-six years before the first iPhone, Simon rang the alarm about a world in which – even then – the sheer amount of information was overwhelming individuals’ ability to process it. He argued that people have only a limited span of attention, and presented what he saw as a shift from information scarcity to attention scarcity as the basis for what became widely known as the “attention economy”.

Simon foresaw the effect of overwhelming amounts of information as causing a “poverty of attention”:

In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention.

One of the solutions Simon suggested was the adoption of automated “attention-focusing” systems that enable individuals to sift through data and identify what is important. He highlighted the role of experts and specialists in various domains who could process and interpret information to enable more efficient attention by others:

A great deal of the design of information-processing systems, therefore, will have to be concerned with the allocation of attention. The real task is not to provide more information, but to organize the sources of information so as to minimize the amount of attention that is required to extract the important information from the mass of unimportant information.

It’s this “allocation of attention” that is, for me, the crux of the matter. How, and by whom, are the vital questions. It requires great trust to be placed in the “experts” who would direct this focusing. And could this be ethically performed by AI, or does it require a human mind to ensure the validity of moderation or curation? (I have daily RSS feeds of hundreds of items that clamour for my attention – and that’s before I get caught up with emails attempting to catch my attention, YouTube videos, podcasts, TV shows, social media, text messages…) Does managing the “allocation of attention” mean, as some argue, forms of digital detox – simply cutting oneself off from all the digital feeds (and, if we’re honest, from the other, non-internet forms of information bombardment)?

Is it, as Sherlock Holmes tells Watson in A Study in Scarlet, that people have to deliberately limit the amount of information to which they are exposed?

I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it. Now the skilful workman is very careful indeed as to what he takes into his brain-attic. He will have nothing but the tools which may help him in doing his work, but of these he has a large assortment, and all in the most perfect order. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.

Am I foolishly taking in “all the lumber of every sort” with little regard? And is it actually doing me any real harm? Should I do more to ensure my attention-agency, to allocate my attention more purposefully?
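
As a thought experiment, here is a minimal sketch – in Python, with entirely hypothetical feed items, keyword weights and a daily cap, not any real feed-reader’s API – of what Simon’s “allocation of attention” might look like applied to an RSS inbox. The point is the shape of the design, not the crude scoring: most items are discarded unread, and attention is spent only on the few that clear the filter.

from dataclasses import dataclass

@dataclass
class FeedItem:
    title: str
    summary: str

# Hypothetical interests, weighted by how much attention each deserves.
INTERESTS = {"attention": 3.0, "locke": 2.0, "authenticity": 2.0, "cool": 1.0}
DAILY_CAP = 10  # read at most this many items, however many hundreds arrive

def score(item: FeedItem) -> float:
    # Crude relevance score: sum the weights of the interest keywords present.
    text = (item.title + " " + item.summary).lower()
    return sum(w for kw, w in INTERESTS.items() if kw in text)

def allocate_attention(items: list[FeedItem]) -> list[FeedItem]:
    # Per Simon, the task is not to provide more information but to filter it:
    # most items never reach the reader at all.
    relevant = [item for item in items if score(item) > 0]
    relevant.sort(key=score, reverse=True)
    return relevant[:DAILY_CAP]

A real curator would, of course, need something far subtler than keyword weights – which is exactly where the trust problem (human editors? AI?) comes back in.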

Hello world!

06-Feb-25

Hello there (if you are reading this, of course). I’m in the process of setting up a super-simple, text-only site here. There’s absolutely nothing here other than this post, and you are most welcome to read and re-read this post to your heart’s content.