Recently, Wiebe E. Bijker, professor of Technology & Society at the University of Maastricht, was in Barcelona to give a talk on “vulnerability in technological cultures”. This public lecture was hosted by IN3, the research institute of the Universitat Oberta de Catalunya.
For those who could not attend the conference, here is the video with his talk:
Abstract of the conference:
Prof. Wiebe Bijker will argue two key ideas about vulnerability. The first is that vulnerability is not necessarily something negative, and can even be considered a necessary condition for innovation: to be able to innovate, one has to be creative and take risks (Schumpeter 1939). It has also been argued that for the smooth functioning of technical systems it is sometimes necessary to take risks (see John Law’s (2003) argument about the train accident at Ladbroke Grove). This argument contrasts with commonly accepted risk management theories and practices, which hold that it is important to define clear rules and protocols and make sure they are followed in order to make an organization as safe as possible. This standard risk management approach will be questioned by drawing attention to the positive aspects of vulnerability and to the influence of rules and rule breaking in constructing a technological culture that is vulnerable and at the same time resilient.
The second idea is that the vulnerability of modern societies can best be studied as the vulnerability of technological cultures. Today’s societies can be considered as tightly knit systems in which technologies are pervasive. Technologies do not merely assist in everyday lives; they are also powerful forces acting to reshape human activities and their meanings. At the same time, the high-tech character of modern societies makes them vulnerable. Such vulnerability is thus an inherent characteristic of today’s societies (Perrow 1999 (1984); Beck 1986). Sometimes this quality turns into a problem or even a disaster. During the last decades we have witnessed several high-tech related disasters. The Challenger space shuttle explosion, the Chernobyl nuclear accident, the Bhopal chemical disaster in India, and the Exxon Valdez oil spill — they all remind us that large-scale systems are vulnerable to human errors and technical malfunctions with far-reaching consequences. Risks to health, safety, freedom of choice, privacy and our environment abound in the world.