I just came across this truly great, in-depth study by the European Environment Agency entitled “Late lessons from early warnings: science, precaution, innovation”, EEA Report No 1/2013. Honestly, I view this as one of their most interesting reports in quite a while. I can only urge everyone to take a look at it. You can download it HERE and even obtain a free hardcopy HERE.

In that report the EEA looks at early warnings from various health hazards such as lead in petrol, tobacco, and DDT; at threats to different ecosystems; and at emerging issues such as radiation from nuclear incidents or mobile phones, as well as genetically modified crops and nanotechnology.

Now, I have not had the time to go through the full report, but – and this is despite all my praise of the report – I noticed that the twelve late lessons to take away are, let’s say, somewhat too general. They are:

  1. Acknowledge and respond to ignorance, uncertainty and risk
  2. Provide monitoring and research on early warnings
  3. Improve scientific knowledge
  4. etc…

So while these lessons are clearly useful to put forward, I do not view them as particularly helpful. We are working on each of these lessons already and, in my opinion, they do not help us see the source of the problem. The point is this: in every case that the report analyzed, sufficient scientific knowledge was already available to give a clear idea of the costs that would arise if the technology were implemented. The fact is that this knowledge was simply ignored.

My guess is that it was ignored because policy makers did not want to constrain innovation in their region or country, did not want to place the burden of proof on the companies themselves, and feared that international competitors would take advantage of any precaution or reluctance to move blindly into unknown territory. And this is the real danger. European companies recently even wrote a letter suggesting that the EU is restricting innovation too much, see HERE.

We are simply not taking the time anymore to fully reflect on decisions. We think that decisions need to be taken now and as quickly as possible, preferably yesterday, in order to keep competitiveness and innovation running. In doing so we thoroughly ignore the potential impacts of any uncertain technology that may harm the biosphere for years to come. And since technology and innovation are advancing so quickly, who knows where this race will end!

This is all the more worrying since some technologies, such as those that induce cancers, sometimes require years and years of thorough research before their true costs are fully understood. Thus, some individuals – company shareholders – run ahead collecting the benefits now, while a potentially much larger group of individuals may bear the brunt of the costs in the future.

Another problem is that the World Trade Organization is reluctant to accept trade barriers, even if they are justified by uncertain technology or health hazards. How can countries pursue proper policy if their well-intended regulations are undermined by international law? What if this leads to trade wars or international conflicts simply because one country is more profit-oriented than another that places sustainability at the forefront?

In my opinion, these points above are the real problems to address and the real lessons to take away. Early warnings fail for precisely these reasons, and they will continue to have little impact on technology adoption unless we take care of these issues!

By the way, a nice point of the report is this: of 88 cases of claimed ‘false positives’, where hazards were supposedly wrongly regulated as potential risks, only four were genuine false alarms. Given the frequency and scale of harm in the mainly ‘false negative’ case studies, and the asymmetrical costs of being wrong about acting versus not acting on credible early warnings, shifting public policy towards avoiding harm, even at the cost of some false alarms, would seem to be worthwhile.