Analytical Recklessness

The basic promise professionals make is not to neglect the responsibilities their expertise carries towards society, and in return they are licensed to organise that expertise. The alternative is that non-professionals could take on those responsibilities, and act recklessly, to the detriment of everyone involved. So some practice is restricted to holders of professional credentials, conferred in accreditation of a combination of training and experience. For this reason professionals have high status, and negligence is taken very seriously, both for its direct consequences and for the damage to the reputation of the profession.

Professional expertise is not infallible, however, and it is a basis of any profession that it adapts and improves as practice and society develop. But the implicit expectation is that there are safeguards within training and professional practice which guard against acting beyond expertise, or recklessly bypassing necessary assurance. This is largely managed by standardised training and then mentoring of junior members, so that they have seen how to use procedures appropriately, and have access to technical review and advice.

Emergencies

In emergent and high-consequence situations, therefore, experienced and well-regarded experts are sought for their extensive knowledge and diligence. They know the limits of their own knowledge, the limits of the professional tools, and where pressure may tell. To compensate for each of these boundaries, professionals establish networks for seeking advice, procedures for deploying new tools, and, most importantly, forums for sharing practical challenges. The risk of reckless behaviour from professionals arises when these compensations are unavailable, and a pandemic creates exactly those conditions.

Setting aside the point that in a pandemic people may be ill without being aware of it, with their judgement compromised, there are other specific strains in emergencies. Capacity has a limit, and so demand for urgent action may exceed the total ability to service it, as was the case in early demand for modelling the pandemic. The response of forming a group with additional resources to distribute, monitor and manage the production of new modelling work was pragmatic. This allowed for coordination as well as quality assurance in the early phases, when groups had initially responded with uncoordinated and duplicative outputs.

Brokerage

The increase in activity was driven by an increase in demand for analytical services, and the increased consumption it serviced was more broadly distributed. While the risk of reckless behaviour was managed in one direction, the challenge of professionals being asked to apply themselves to domains beyond their experience was not so simple. Questions typically arise that span areas of expertise, so the need to source suitable evidence and experts can be anticipated, but doing so relies on background appreciation and networks.

To answer a question it must first be operationalised, with implicit assumptions and priorities established. Analysis as a service in an emergency needs to be able to discern what is urgent, and what will have the most important impact, both immediately and as a foundation for the future. Neither is obvious to either side, given the complexity, uncertainty and limited tractability of the issues involved, though more and better data may be part of the solution. And if understanding of these is incomplete, poor choices about analyses and their utility can be made, which is why analyses are brokered, not simply delivered.

Unintended Consequences

If analysis is managed and brokered, it may yet be inappropriate, not so much through errors, which ought to be caught by assurance, as through unwarranted assumptions or misrepresentations, which need mitigation. So the professional responsibility is to evaluate the risk on each count, examine the warrant for the assumptions, and design suitable presentation. Even so, where professional evaluation changes due to new information or unplanned dissemination, there is a responsibility to warn that the analysis is not robust to such developments.

Not to have made such evaluations is, literally, negligence, and the defence is limited to cases where possible eventualities were unknown at the time, so having made inquiries is a mitigation. But such neglect still ought to be admitted and compensated for where it cannot be mitigated, and having known there were such risks and simply discounted them for expedience is reckless. Consequences which were not intended but known to be possible and undesirable ought to be evaluated for their evidence, difficult as this is.

Misleading Presentations

Much has been made of the assumptions in pandemic scenario projections, but critiques have concentrated on the correctness of early calibration using incomplete data, not on the specification of heterogeneity in models, where ontological assumptions have radical impacts. In both cases there has been a commitment to resolve suitability once better data can be collected, even if sensitivity to a plausible range of calibrations is not produced. Where care has been less evident is in the presentation of analyses which might mislead, especially as repeated presentations aggregate to give a perception of the whole picture.
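What "sensitivity to a plausible range of calibrations" means in practice can be sketched with a toy example. The simple SIR model, parameter values and calibration range below are illustrative assumptions of mine, not taken from any pandemic model discussed here; the point is only that reporting outputs across a range of calibrations, rather than a single point estimate, makes the sensitivity visible.

```python
def sir_peak_infected(r0, gamma=1/7, n_days=365, i0=1e-4):
    """Run a simple discrete-time SIR model and return the peak
    infected fraction. All parameter values are illustrative."""
    beta = r0 * gamma          # transmission rate implied by R0
    s, i = 1.0 - i0, i0        # susceptible and infected fractions
    peak = i
    for _ in range(n_days):
        new_inf = beta * s * i
        new_rec = gamma * i
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

# A plausible range of calibrations for the reproduction number,
# rather than a single point estimate.
for r0 in [1.5, 2.0, 2.5, 3.0]:
    print(f"R0={r0:.1f}: peak infected fraction = {sir_peak_infected(r0):.3f}")
```

Even in this toy, the headline output roughly quintuples across a range of calibrations that early, incomplete data could not have distinguished between, which is exactly the sensitivity a single projection hides.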

The fact of a pattern, its shape, and its size can each be the focus of communication, and giving false certainty about any of them would be misleading. This can be inadvertent: not stating implicit assumptions, or not considering some scenarios or combinations of variables in the analysis. But by thinking about the audience who might be misled, it becomes a professional's responsibility to consider what they may know and how they will relate to the presentations, and to add caveats. Not to do so would be negligent, even though suitable training is limited.

Out of Domain Science

Much of science has barriers to participation, access to a laboratory for example, which are enforced by members of the relevant discipline. And when science is produced, it is also scrutinised, preventing work that does not meet certain quality standards from gaining approved status. Peer review is the typical description for this process; although the term usually describes the editorial system for academic journal articles, other scientific outputs are also scrutinised by scientific peers ahead of publication or approval by regulators.

An emergency suspends several of these checks, because the standard process for review is expedited or bypassed entirely. And those who might be sought to do the reviewing, or to maintain norms of quality and process in the discipline, are committed to urgent activity. But this is exactly where leadership in sustaining professional standards matters, as the impact of reckless behaviour could be more damaging. It applies especially in analytical work, where there are not the same physical barriers to entry as for bench work.

Risk Evaluation

The distinction drawn so far, that professionals may be negligent and should review themselves and each other, while non-professionals may be reckless and need to be policed, is not quite fair. The point is that professionals are aware of what they should do and of the risks of not doing it, whereas non-professionals have gaps in their understanding. So professionals, particularly in urgent action, can evaluate the risk and make a judgement about how to mitigate those risks not thought acceptable, whether by taking advice, collecting more data, and so on.

Thus recklessness is to proceed despite known risks, and negligence is not to have assessed them (non-professionals usually know that they are not professionals and so bear a risk). A professional can therefore proceed recklessly by discounting a level of risk, whether a known chance, or a known uncertainty about the impact or likelihood of an adverse event. That could mean not considering the potential audiences for an analysis, and therefore how it might mislead them and how to mitigate that risk, as well as its quality.

Reasonable Expectation

So what is a reasonable and proportionate expectation of professionals, to anticipate consequences and evaluate the risks associated with them? The solution is sometimes framed as an ethics of care, particularly as applied in the field of innovation, and so fixed on novelty, where uncertainty of impacts is seen as a potential risk. But this is unsatisfactory for several reasons:

  • To some extent, this assumes a fixed understanding of the ontology;
  • Risk arises as the world changes, not just the impact of the humans;
  • Complex systems cause surprises for environmental reasons too;
  • Communication has political and social functions beyond analysis.

An ethics of care might lead analytical innovators to be cautious about developing and recommending novel analytical ideas, but that does not really address the risks involved. Counterfactual analysis, in terms of what potential alternative analytical scenarios may mean, is already recommended, but without the radical uncertainties of complex systems. So the responsibility might be specified as considering the topology of a complex system and seeking advice on whether regularity constraints can be violated.

Communicating Uncertainty

The advice to say what you know and the uncertainty around it, what you don't know and what you are doing to find out, and what changes to expect to interim advice, is sound. But a lot rests on doing the first step well, i.e. knowing something and communicating that knowledge in the context of its uncertainty. And that neglects the question of what the audience needs to know to determine their own actions, as well as whether they appreciate the structure of the uncertainty involved.

It would be reckless to have discounted any aspect of knowledge, its lack, or potential action. A more difficult issue is when information is communicated to shift action from one state to another, e.g. from inaction to response, and is selected for salience rather than for its quantity. Some research suggests a better approach is to change the framing to the chance of exceeding a threshold, rather than a fixed but shocking estimate. Professionals also need to preserve trust for what they may need to say in the future, not just immediately.
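The difference between the two framings can be made concrete with a small sketch. The ensemble of scenario projections, the capacity threshold and the quantity being projected below are all invented for illustration; the contrast between a single headline figure and a threshold-exceedance statement is the point.

```python
import statistics

# Hypothetical ensemble of scenario projections for peak daily
# admissions; all figures are invented for illustration.
projections = [180, 220, 260, 310, 150, 290, 340, 200, 240, 270]
threshold = 250  # e.g. the surge capacity of the system

point_estimate = statistics.mean(projections)
p_exceed = sum(p > threshold for p in projections) / len(projections)

# A fixed (and potentially shocking) central estimate:
print(f"Central projection: {point_estimate:.0f} admissions per day")
# Versus a threshold framing of the same ensemble:
print(f"Chance of exceeding capacity ({threshold}): {p_exceed:.0%}")
```

The central estimate here sits just below the threshold, which invites inaction, while the same ensemble says capacity is exceeded in half the scenarios; the threshold framing carries the decision-relevant uncertainty that the point estimate discards.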

Rising to the Challenge

Reckless behaviour is something all professions guard against, but the character of analytical work makes it harder to prevent: barriers to entry are low, and impacts are substantially unknown, especially as analysis is often evaluated not against reality but against desired policy. There is a political element to analytical work, and this means that values are always present, even when choices are described as conventions. The highest standard of practice is to make a proportionate evaluation of the choices, including omissions, to understand what is at stake and how it may have unwanted impacts.

Emergencies offer a bigger challenge, as there is urgent demand for knowledge about the future, which various approaches to analysis aim to provide. Again the evaluation of risk is important, but now it must cover which uncertainties are important to resolve, and plan further analyses as a programme. Data is particularly important, and as a bad workman blames his tools, perhaps a bad analyst blames his data: we must work with what we have, but also plan for the future, even in the middle of a crisis. So it is most encouraging that the government in the UK now sees data as a key component of risk planning.
