Cognitive revelation

Rui Vale
16 min read · Dec 4, 2022

My reading notes on Gérald Bronner’s book Apocalypse cognitive[1] turned into a review — sort of.

I’m referring to the original French edition as there’s no English translation available. I found it a sobering analysis of humanity’s cacophonic predicament in the contemporary informational deluge.

Contrary to what one might suppose, the author is an almost Victorian modernist, a plain-vanilla rationalist who’s said to be one of France's main proponents of cognitive sociology.

The meaning he attributes to the word “apocalypse” is not the commonly held one of a cataclysm, but the etymological one of a “revelation”, a point he only makes halfway through the book while provocatively pointing to research stating that 59% of everything shared on social media wasn’t even read[2].

The book is focused on the massive deregulation that in the past twenty or so years has swept what he calls the cognitive market, a market for ideas, which is now in an idealised — pun intended — state of perfect competition where the traditional gatekeepers lost their clout and no longer rule over or even curate what has become a massive trove of available information at everyone’s fingertips, mainly through the internet.

The individual entitlement to share and promote one’s representation of the world or of any of its affairs, coupled with a bestowed social legitimacy to participate in whatever debate, is now a sociological fact.

There are no moats or barriers to entry in the cognitive market; any intellectual model can have a go at it, with success or failure mostly dependent on market dynamics such as those propelled by supply and demand.

The author, however, divides the assortment of all possible ideas on offer in the market into two domains: crude beliefs and methodical thinking.

To the assumption — and even theory-laden postulate[3] — that sound ideas, i.e. those backed by the canon of reason, will end up winning over unfounded beliefs such as superstitions, urban legends and conspiracy theories, the author opposes reasons for concern based on available contrary evidence, referred to in his previous work[4] and throughout the book.

In a nutshell, the book deals with what’s considered humanity’s most precious belonging. Not a precious, magical ring, but one’s brain, and more specifically the ability to get hold of it in order to think, rationally.

It then moves on to the assertive realisation that each of us has one and, most importantly, to the implications of the enormous potential of so many of us being together and getting along in such a planetary cocktail party.

And finally, that the fight for the realisation of that potential has begun, either through exponential credulity towards a dumb extinction or through an essentially linear, painstaking progression of our civilisation along the Kardashev scale, till we get out of here, à la Musk.

Though I digressed a bit on that last paragraph, my main takeaway from the book is that we’re in a re-enacted, digital version of the universal flood myth with no ark in sight, where one has either to keep swimming, grow gills, or face a senseless fate under a zombified existence. Yes, I’m still digressing.

Here’s then to the human brain, physically and socially embodied.

Although the common notion of progress seems somewhat deflated today, brain time allotted to work during a lifetime has decreased from 48% in the 1800s to 11% nowadays and, thanks to the washing machine and the vacuum cleaner, time spent on domestic chores has also plummeted. That’s progress.

Such time savings began the moment our hairy, naked ancestors realised that sowing carries the expectation of harvest, a cause-effect nexus that can be predicted, hence planned and properly administered, its efficiency ensured through division of labour and authority; politics ensued, followed by more savings and loincloth.

A war chest of liberated time accrued; saved time that kept not just accumulating but also being diverted to other fancies, either recreational or innovative out of necessity, fuelling the exploration of possible future worlds. The added value, the surplus gained is therefore brain time, with the potential to be well spent.

The how then gradually replaced the why in one’s practical outlook on life, going from submission to domination; natural, eventful causes rule the world, not revealed capricious intents; a slowly changing interpretation operated with the unsuspected help of magicians, alchemists and even astrologists, up to Max Weber’s disenchantment of the world through rationalisation.

Cognitive outsourcing is now upon us as automation crossed the physical threshold by eating through intellectual activities with voracious indifference, not just the ones that can be algorithmically rendered but creative ones as well.

As to how this is going, a question lingers: what are we, the people, going to do?

Leaving that question aside, the point is that released brain time keeps mounting, with life expectancy and medical advances delaying age-related cognitive decay and adding to it further.

It’s a revolution to which scientific advancement is indebted; time to think, to intellectually engage in discovery is a necessary, though not sufficient, condition to bring it to fruition.

Finally, this capital of liberated thinking time has multiplied by a factor of eight since the 1800s. Good, so far.

But we’ve been getting less and less sleep[5]. We’re also becoming more and more interrupt-willing, like dozing sentinels or pairable devices in power save mode[6].

Sure enough, the time recouped from sleep is but a very small fraction of the aforementioned increase in liberated time, but it can serve as an entry point to a phenomenon at the heart of the contemporary problem addressed in the book, the said cognitive revelation.

The main issue is that the siphoning of our attention through screens carries with it our mental availability, equating to a mere rerouting of the liberated time in a pervasive, infiltrating interpenetration of our daily routines, turning us into “smombies”, hijacked hosts of the smartphone[7].

It seems the cognitive market’s hypermodernity, by latching on to the ancestral, evolutionary hooks of the human brain, has become a battlefield where a significant part of humanity’s collective destiny is at stake, but in what terms?

Here’s now to the invisible gorilla at the global cocktail party.

In a world where 90% of the information in existence is less than two years old, the chance of a sensible, reasoned pick of the best tunnel to funnel one’s attention has become stochastic at best.

Under cognitive load one tends to get caught by instinctual, innate approaches to sifting. It becomes a pull process driven by the survival-driven hooks of fear, anger, sex, pop-out effects[8], in-group vs. out-group favouritism, and a ton of other biases, which the author sees as our species’ deep behavioural invariants being dutifully exploited by the manipulation of technological variables.

And the pull is exacerbated by the cocktail party effect, drawing people to the virtual circus-like arena through a built-in illusion of self-betterment, of the particular kind that demands public recognition of the beauty of one’s soul. It’s an illusion that, once bootstrapped in shock, swiftly teleports one to the disapproving amazement at others’ lower deeds and sayings, riding on indignation up to outrage escape velocity.

Nuance disappears under the extreme simplification afforded by online transparency, bringing everyone to what the author calls hyper-consequentialism, a notion picked up from Elizabeth Anscombe’s critique of consequentialism[9], a sort of Jeremy Bentham and John Stuart Mill utilitarianism on steroids. It goes further than holding one responsible for one’s unintentional mishaps, extending responsibility to whatever fortuitous benefit happens to fall in one’s lap, a chastisement for detouring from Hans Jonas and James Lovelock’s responsibility principle of always assuming the worst possible outcomes of human action. Under hyper-consequentialism, everything you might do defaults to bad, as does whatever you might abstain from, and everybody is watching.

This is then coupled with a sensibility epidemic that breeds a kind of enviable over-victimisation, demanding safe spaces for protection against disturbing contradiction, along with the so-called trigger warnings of threats to “intellectual security”.

The hyper-consequentialist notion of symbolic violence grossly equates it to physical violence, making it a matter of degree instead of kind, and lowers the threshold on issue salience and relevance by inflating its implicit definition and reach, up to the point of becoming negatively correlated with the phenomenon’s real prevalence[10].

As the increased potential for agonistic reaction to whatever issue is put on the spot is realised under recurrent repetition, another effect kicks in, that of stimulus reduction from habituation, which by itself brings even further demands on the intensity of conflict, making it hyperbolic, up to a sort of Web re-wilding, close to a brutish, Hobbesian sense.

It’s well known that continued exposure to a stimulus mutes the signal, yet humans are still caught in a dynamic tension between the cosy exploitation of the known and tamed, and the potentially dangerous but also opportunistic exploration of the unknown, hence the pervasively alluring value that novelty brings to information, epitomised in the cognitive market as clickbait, putaclic in French.


We’re even willing to subject ourselves to physical discomfort to overcome our renewed state of cognitive incompleteness[11]; however trivial, our curiosity cycles through impulsiveness, heightened intensity, and somewhat deceptive satiation.

Collectively, there’s this paradoxical melancholy attributed to democratic societies, where the proclaimed equality of opportunities towards the realisation of one’s ambitions is promptly corrected by the everyday experience of pervasive inequality, hence of frustrated satisfaction. Notably identified by Alexis de Tocqueville in the American society of his time, it has since been heightened under the transparency afforded by so-called parasocial interaction, which reduced the already colloquial six degrees of separation to Facebook’s three and a half[12].

Notoriety, according to Bourdieu, is what singles out the individual from the obscure bottom of unattended indifference[13]. Such coveting has been found to extend to our cousins, the rhesus monkeys, shown willing to sacrifice a delicious treat for the chance of observing either the female perineum or the face of a higher-ranking male[14]. On our part, the internet brought personal branding and its anointed micro-celebrities[15].

To better understand these phenomena, the author calls for an analytic framework based on a kind of non-naïve anthropology, one that takes into proper account our species’ invariants of built-in attunement to information that is egocentric, agonistic, fear- or sex-related, a tendency kept in check throughout history by cultural forms of regulation such as censorship, religious taboos, geographic barriers to information dissemination, and benevolent paternalism.

Today, the attenuation and outright removal of many of said cultural obstacles is allowing a virtually — yes, pun intended — unconstrained match of supply to demand, the best and the worst of it arising from an exchange now driven by the aforementioned whims, to the point of blurring even the image we make of ourselves; a naïve anthropology, according to the author.

Such naïve anthropology can blind us to what the author calls an emergent editorialising of the world, a process characterised by the market logic that exploits our primitive invariants under exposure to the accelerated savannah of the internet, where the dynamics of novelty surpass what’s available in the real, analogue world in both volume and speed, the analogue world underperforming the internet in its flow of solicitations and rewards[16].

Editorialising the world, as an evaluative focus on one element of reality over another by establishing an order of relevance and importance within an interpretative narrative of human values, is unavoidable, a given dimension of whatever discourse about the world. However, the dominant role played by traditional gatekeepers such as journalists, academics or other notably legitimate actors has shifted towards whoever offers the best fit to the anticipated demand, regardless of its status.

Conventional media outlets, being commercial enterprises themselves, follow suit by adjusting to the new playbook in their struggle for survival. Countering a perhaps too naïve, even utopian outlook, increased market competition hasn’t led to more diversity. On the contrary, most of what’s published is copied (sometimes even verbatim) from previously published material.

The qualitative betterment of the information available, expected from demand’s improved sophistication, remains an unfulfilled promise of the internet. Maybe because there’s not enough time, as the effects of concentrated attention are bound to be sudden and gigantic, but very brief[17]. Known as the buzz, those spurts of attention seem to have a time span I’d metaphorically compare to the life cycle of a fruit fly. Likewise, the augmented diffusion of media content hasn’t curtailed the homogenising phenomena commonly known as echo chambers, placing us in a kind of cognitive insularity, surrounded by an ever-expanding ocean of information.

Here’s now to the predicament the revelation has brought.

At this point in the book, you’ll have taken a good look at the reflection of the baleful face of the severed head of Medusa — as it appears on the book’s cover — seeing yourself displaying that awful appetite for conflict, that lazy and avaricious cognition, along with a subservient stance towards the injunctions of social visibility. What do you make of yourself?

In the last part of the book, the author advances three interpretations.

The first takes a misanthropic outlook on the human condition, one in which people are just flawed beyond repair; our mediocrity and meanness can’t be helped; a lost cause. This is an interpretation that, according to him, can only lead to withdrawal, and one he doesn’t subject to deeper analysis.

The second interpretation takes what’s revealed as artificial, a socially constructed artefact uncharacteristic of the human condition, an outcome of a denaturation process enacted by market forces, notably by neoliberalism. As such, it merits proper analytic consideration.

The third takes this cognitive revelation on a bright side of sorts, claiming for it a political legitimacy from carrying a deep truth possessed by the majority, by the people, against a small dominating elite attempting to devitalise it under the guise of reason and science. Dubbed neo-populism, this interpretation is taken as a people’s sophism, yet one worthy of careful analysis, according to the author.

The author cautions that, market mechanisms and their powerful effects notwithstanding, what we’re dealing with is but an aggregation of complex interactions leading to reciprocal adjustments, namely the opportunistic actions of actors whose success — and in many instances livelihood — depends on their capacity to draw attention.

Yet, following the anthropologist Wiktor Stoczkowski, one question that can be asked about this revelation is whether the aforementioned human imperfections, and the outcomes thereof, are ontological inscriptions of how things are, inevitable as such, or mere accidents of history[18].

Hence the “denaturalised man”, an unfortunate condition rendered by modernity, the industrial revolution, and insidious capitalism, as hinted at by General Motors’ Charles Kettering in the 1920s when he said that the key to economic prosperity lies in the creation of organised dissatisfaction. Or by John Kenneth Galbraith’s mid-20th-century assumption that goods made available elicit consumer needs, not the other way around. Or even Marcuse’s one-dimensional swallowing of the unsuspecting man by the oppressive control of the technical apparatus, then swiftly teleported to Marx and Engels’ camera obscura of the dominant ideology under Adorno and Horkheimer’s dialectic of enlightenment[19].

And the show goes on, picked up by Guy Debord’s overwhelming, spectacular contemplative images of objects of want, hijacking the comprehension of one’s existence and desires, followed by the technological, financially abstracted mediasphere that has since turned people into sentimental fools reduced to mechanical docility under a semiotic, fully standardised neuro-totalitarian capitalism[20].

It’s an interpretation the author finds hardly convincing, as it stems from a naïve anthropology, portraying the contemporary human as robbed of natural essence and the observed behavioural phenomena as entirely artificially driven; one of the latest accidents of history. The remedial upside of it, which the author also finds misguided, is the assured hope in the possibility of change, resting on the assumption that the current state of affairs was constructed in the first place and is therefore amenable to social reform under properly guided political action.

Human cultural history may have actualised some of our species’ behavioural potential, but surely that’s not all there is to it. One can easily concede that the current, unregulated cognitive market hasn’t exactly revealed our best side, but that shouldn’t lead to the belief that such faulty facets aren’t actually our own, that they weren’t there all along, reducing them to mere fixable social constructs. Doing so, the author warns, is doomed to fall at reality’s feet, as is always the case with anthropologically naïve hopes. Simply trying to abolish our species’ fundamental traits can only end in a delusional form of alienation, of which the historical record of the many attempted realisations of utopias is a prime example.

So, this cognitive revelation presents us with a sneering caricature of ourselves. Like all caricatures, it reveals some essential traits, neither entirely artificial nor a faithful depiction. Though real, all those salient traits exposed by the cocktail party effect aren’t the final word, much less the legitimately spontaneous cognitive expression of a betrayed people on the verge of liberation from the elites; the neo-populist interpretation.

The ongoing deregulation of the information market gave visibility to previously secluded aspirations to address the people directly, an ancient political fiction now expediently enhanced by the much less restricted flow of information, almost solely subject to free-market dynamics.

The neo-populist interpretation, though conceptually meagre, bears a strong core in the recurring idea of the betrayal of the people by corrupt elites, hence its buildup of a homogeneous collective will to pursue rightful, unmediated political action to counter the elites’ malfeasance.

The author borrows Edward T. Hall’s concept of proxemics to describe this attempt to curtail the symbolic distance between the elected and the electorate, a kind of political proxemics supposedly attainable through translucent transparency, along with a renewed approach to democratic participation featuring, among other things, the demotion of experts in favour of the people’s common sense. This is also known as disintermediation[21], bearing the assumption that removing mediation and weakening regulation unleashes the full power of cognitive demagogy. Far from blaming and loathing the externalities of the market, as the proponents of the denaturalisation of man do, neo-populists give those effects political legitimacy.

Applause metering contaminates belief with desire in a playing field where fiction also plays a big role: starting as a recreational pastime that gives solace to human cognitive incompleteness, it ends up rendering the actual meaning of our environment. As such, even a mere story with no other pretension than exploring our imagination still rides one’s natural predispositions to thrive in the cognitive market.

Though how it works still eludes us, a competitive market-bidding process gives currency to the meaningful intents of social actors, breathing life into them as they become actions, movements, claims, and both constructive and destructive realisations.

Fiction can orient the real. Science fiction is a case in point, having inspired technological development by the likes of Jobs, Musk, and Bezos. And from there we’re now on to things like future casting or worldbuilding. By helping to think the unthinkable, fiction also gives it credence.

The storied fictions we carry in ourselves, or even the ones we’ve just grown accustomed to, guide us toward certain intellectual models that are spontaneously called upon to make sense of the world, first by pulling attention, then conviction, sometimes even in contradiction with the facts.

Both the myth of the denaturalised man and the neo-populist slogans hamper a required, sensible public debate, dwarfing the possibility of making the world a bit more intelligible. Hence the need for an alternative narrative and analytic space between these two poles, based on good old rationalism, or neo-rationalism, as posited by the author. That’s the book’s ambition.

A tough call, considering both poles are joined in an implicit detestation of rationality, blaming it for our present anxious times, as if we were already living in a sort of Hell on Earth.

Make no mistake, our civilisation faces some big, existential risks: climate derailment, resource depletion, mutually assured destruction capability, and whatnot. Yet, the author believes there’s also good reason to count on the still many available possibilities.

Having gotten this far, we owe it to ourselves not to go under, the avoidance of such a bleak prospect coming from none other than our intellectual resources and our ability to engineer a collective intelligence capable of overcoming our shortcomings. For that to happen, surely among many other things, we’ll need to put our limited brain time to good use.

The book comes with a challenging and stimulating combination of examples and references, extending far beyond what can sensibly be captured in these already exceedingly long reading notes.

There are a couple of reviews of the book available; here’s a nice one, also in French[22], much more succinct than this one. There’s also this small review in English here, along with a translation sample of the book.

[1] Bronner, G. (2021). Apocalypse cognitive. Presses universitaires de France

[2] Maksym Gabielkov, Arthi Ramachandran, Augustin Chaintreau, Arnaud Legout. Social Clicks: What and Who Gets Read on Twitter? ACM SIGMETRICS / IFIP Performance 2016, Jun 2016, Antibes Juan-les-Pins, France. ⟨hal-01281190⟩

[3] Here he is referring to Raymond Boudon and his progressive, somewhat evolutionary, theory of ideas, notably in Le juste et le vrai : études sur l’objectivité des valeurs et de la connaissance, Paris, Fayard, 1995.

[4] Bronner, G. (2013). La démocratie des crédules. Presses Universitaires de France.

[5] Owens, J., Adolescent Sleep Working Group, & Committee on Adolescence (2014). Insufficient sleep in adolescents and young adults: an update on causes and consequences. Pediatrics, 134(3), e921–e932.

[6] Rosen, L., Carrier, L. M., Miller, A., Rokkum, J., & Ruiz, A. (2016). Sleeping with technology: cognitive, affective, and technology usage predictors of sleep problems among college students. Sleep health, 2(1), 49–56.

[7] Povolotskiy, R., Gupta, N., Leverant, A. B., Kandinov, A., & Paskhover, B. (2020). Head and Neck Injuries Associated With Cell Phone Use. JAMA otolaryngology — head & neck surgery, 146(2), 122–127.

[8] Hsieh, P. J., Colas, J. T., & Kanwisher, N. (2011). Pop-out without awareness: unseen feature singletons capture attention only when top-down attention is available. Psychological Science, 22(9), 1220–1226.

[9] Anscombe, G. E. M. (1958). Modern Moral Philosophy. Philosophy 33 (124):1–19.

[10] Levari, D. E., Gilbert, D. T., Wilson, T. D., Sievers, B., Amodio, D. M., & Wheatley, T. (2018). Prevalence-induced concept change in human judgment. Science (New York, N.Y.), 360(6396), 1465–1467.

[11] Lau, J. K. L., Ozono, H., Kuratomi, K., Komiya, A., & Murayama, K. (2020). Shared striatal activity in decisions to satisfy curiosity and hunger at the risk of electric shocks. Nature human behaviour, 4(5), 531–543.

[12] Bhagat, S., Burke, M., Diuk, C., Filiz, I. O., & Edunov, S. (2016). “Three and a half degrees of separation”. Research at Facebook.

[13] Bourdieu, P. (1976). “Le champ scientifique”. Actes de la recherche en sciences sociales. The particularly troubled translation is mine.

[14] Deaner, R. O., Khera, A. V., & Platt, M. L. (2005). Monkeys pay per view: adaptive valuation of social images by rhesus macaques. Current biology: CB, 15(6), 543–548.

[15] Marwick, A. E. (2015). Instafame: Luxury Selfies in the Attention Economy. Public Culture, 27(1), 137–160.

[16] Lachaux, J. (2014). Chapitre 5. L’économie cérébrale de l’attention. Dans : Yves Citton éd., L’économie de l’attention: Nouvel horizon du capitalisme ? (pp. 109–120). Paris: La Découverte.

[17] Beauvisage, T., Beuscart, J.-S., Couronné, T., & Mellet, K. (2013). « Le succès sur Internet repose-t-il sur la contagion ? Une analyse des recherches sur la viralité », Tracés. Revue de Sciences humaines, published online 1 December 2013, accessed 27 November 2022.

[18] Stoczkowski, W. (2011). La double quête : un essai sur la dimension cosmologique de synthèses interdisciplinaires en sciences sociales. Nouvelles perspectives en sciences sociales, 7(1), 137–155.

[19] Horkheimer M. & Adorno T. W. (1982). Dialectic of enlightenment. Continuum.

[20] Berardi, F. (2014). Chapitre 8. Attention et expérience à l’âge du neurototalitarisme. Dans : Yves Citton éd., L’économie de l’attention: Nouvel horizon du capitalisme ? (pp. 147–160). Paris: La Découverte.

[21] Devecchio, A. (2019). Recomposition — Le nouveau monde populiste. Paris. Les Éditions du Cerf.

[22] Michelot, F. (2021). Review of [Bronner, G. (2021). Apocalypse cognitive. Presses universitaires de France]. Revue des sciences de l’éducation, 47(2), 274–276.


