ESRS: Back to Ball & Brown (1968)?

Posted by Thorsten Sellhorn - Aug 27, 2025

In the current debate on the European Sustainability Reporting Standards (ESRS), one theme stands out as both surprising and irritating: Few — with Christine Lagarde or PRI among the notable exceptions — seem seriously interested in understanding whether the hundreds of widely contested ESRS data points are actually useful to stakeholders.

The conversation is dominated by confident claims: “Most of these data points are useless.” “Nobody will ever use them.” “It’s disclosure overload.” These statements are made with remarkable certainty — yet, as far as I can tell, with little systematic empirical evidence beyond anecdotal reports that few ESG-related questions are posed in investor calls or at annual shareholders’ meetings.

At the same time, preparers and users visibly disagree: Preparers tend to emphasize the cost and irrelevance of many ESRS disclosures. In contrast, at least some users see them as potentially informative. The gap between these perspectives is striking — and largely unexplored.

This situation strongly recalls Ball and Brown’s legendary 1968 study. For decades before them, accounting research had been normative, theorizing a priori about how earnings should be calculated in order to matter. Ball and Brown, then Australian-born twenty-something Ph.D. students at the University of Chicago, were exasperated by the absence of any role for empirical evidence. In their 2014 retrospective, they wrote: “An analogy […] is given by motor industry R&D. It was as if academics were saying ‘[W]e have not collected data on whether or how consumers use cars as they currently are designed, but we know a priori that these cars are useless and all cars henceforth should be redesigned as 16-wheel trucks.’”

Consequently, they decided to break with that tradition. Instead of speculating, they asked what investors actually did with accounting numbers. Their evidence showed that earnings were informative to markets — not because they were timely or novel, but because they reflected facts that investors considered relevant for valuing companies. The real breakthrough was methodological: They had brought empirics into a field dominated by a priori reasoning.

 

An alternative story: What if the contested ESRS data points are actually useful?

Let’s challenge the dominant narrative that most ESRS data is a waste of resources. Suppose serious user analysis were to reveal that many of the contested ESRS data points actually are useful — at least to some stakeholder groups (not to mention that many of those that may not be useful today may well become useful in the future, once disclosure supply creates its own demand). Suppose further that some of these data points expose uncomfortable truths for companies: emissions hot spots, workforce issues, supply-chain dependencies, or governance weaknesses. In such a world, at least some firms would have strong incentives to downplay the usefulness of this information.

What observable implications would we expect in that scenario?

  • We would expect a gap between voluntary disclosures and stakeholder demand: companies would be reluctant to publish precisely those data that are most informative (and potentially damaging) — probably including supply-chain data like upstream Scope 3 GHG emissions, which are massive for many firms and over which their control is limited. (Witness the fierce resistance against Scope 3-related disclosure requirements in the contexts of the ESRS, the ISSB standards, and the ill-fated SEC climate disclosure rule.)
  • We would expect the narrative “nobody asks for this” to become strategically attractive: if enough voices repeat that ESRS data points are useless, it legitimizes resistance and avoids uncomfortable transparency. See above: The argument of “no demand” has rarely held companies back from creating new products and services — and from relying on advertising to “create” such demand. (By the way, in Nike’s income statement, advertising expense is currently labeled “demand creation expense.” In FY2024, it was $4,285m.)
  • We would expect claims of disclosure overload to be less about genuine user confusion and more about shifting the conversation away from the data stakeholders really want.

This perspective by no means ‘proves’ that all ESRS data points are useful — far from it! But it casts doubt on the sweeping claim that most of them are useless. At the very least, it suggests we should pause before accepting that narrative at face value. And, as Ball and Brown showed us, we should look at empirical data.

 

The next Ball & Brown moment?

Fifty-seven years ago, Ball and Brown transformed accounting research by confronting speculation with evidence. Today, we need the same for sustainability reporting. Before we dismiss many ESRS data points as “useless,” we should ask some obvious empirical questions:

  • Which ESRS data points are actually used, by whom, and for what purposes?
  • Which ones trigger reactions — in capital markets, with consumers and employees, in civil society?
  • Where are the mismatches between what is reported and what stakeholders truly want?

At the Sustainability Reporting Navigator, we are taking first steps in this direction.

First, we have built a public resource that provides a quick and transparent overview of EFRAG’s recent changes to the first set of ESRS, broken down into individual data points. Access it here.

Second, we track users’ information queries in our free, open-access ESRS reports database. Doing so allows us to see clearly which questions users ask of ESRS reports, which ESG matters and data points they focus on, and which firms they benchmark against. EFRAG recently launched a similar ESRS information hub (about six months after ours went online).

We sincerely hope that EFRAG will also take steps toward tracking and analyzing data usage — to build a sound evidence base for curating ESRS data points in the ongoing omnibus reforms. (The ‘decision tree’ EFRAG uses to assess reporting relevance is not publicly available, and it is unclear how it incorporates user evidence.)

The Sustainability Reporting Navigator team stands ready to assist in any way we can.

 

Until such work is done, sweeping judgments about “useless data” remain just that: judgments, not evidence.