Collingridge dilemma






From Wikipedia, the free encyclopedia
 


The Collingridge dilemma is a methodological quandary in which efforts to influence or control the further development of technology face a double-bind problem:

  • An information problem: impacts cannot be easily predicted until the technology is extensively developed and widely used.
  • A power problem: control or change is difficult once the technology has become entrenched.

The idea was coined by David Collingridge at the University of Aston Technology Policy Unit in his 1980 book The Social Control of Technology.[1] The dilemma is a basic point of reference in technology assessment debates.[2]

Background

In "This Explains Everything", edited by John Brockman, technology critic Evgeny Morozov explains Collingridge's idea by quoting Collingridge himself: "When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult, and time-consuming."[3]

In "The Pacing Problem, the Collingridge Dilemma & Technological Determinism", Adam Thierer, a senior research fellow at the Mercatus Center at George Mason University, relates the Collingridge dilemma to the "pacing problem" in technology regulation: the notion that technological innovation increasingly outpaces the ability of laws and regulations to keep up. The idea was first articulated in Larry Downes' 2009 book The Laws of Disruption, which states that "technology changes exponentially, but social, economic, and legal systems change incrementally". Thierer connects the two concepts, arguing that "the 'Collingridge dilemma' is simply a restatement of the pacing problem but with greater stress on the social drivers behind the pacing problem and an implicit solution to 'the problem' in the form of preemptive control of new technologies while they are still young and more manageable."[4]

One proposed solution to the Collingridge dilemma is the precautionary principle. Adam Thierer defines it as the belief that new innovations should not be embraced "until their developers can prove that they will not cause any harm to individuals, groups, specific entities, cultural norms, or various existing laws, norms, or traditions".[4] If they fail to do so, the innovation should be "prohibited, curtailed, modified, junked, or ignored".[5] This definition has been criticized by Kevin Kelly, who believes such a principle is ill-defined[4] and biased against anything new, because it drastically raises the threshold for innovation. According to the American philosopher Max More, the precautionary principle "is very good for one thing — stopping technological progress...not because it leads in bad directions, but because it leads in no direction at all."[5] The 1992 Rio Declaration on Environment and Development, however, defines the precautionary principle differently: "Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation."[6] Rather than blocking change until safety is proven, this formulation is meant to legitimate protective measures, countering the tendency of a technology's advocates to delay regulation until irrefutable evidence of harm is produced.

Collingridge's own solution was not the precautionary principle but the application of "intelligent trial and error": a process in which decision-making power remains decentralized, changes are manageable, technologies and infrastructures are designed to be flexible, and the overall process is oriented toward learning quickly while keeping potential costs as low as possible.[7] Collingridge advocated making innovation more incremental, so as to better match the pace of human learning, and avoiding technologies whose design is antithetical to an intelligent trial-and-error process.

Current context

The Collingridge dilemma applies readily to a world in which artificial intelligence and cloud computing are gaining ground and new technologies are adopted at a rapid pace. Governing AI, cloud computing, and other rapidly developing technologies without slowing their development is a major challenge now facing governments and organizations.

References

1. Collingridge, David (1980). The Social Control of Technology. New York: St. Martin's Press; London: Pinter. ISBN 0-312-73168-X.
2. Böhle, Knud (2009-08-01). "DESIDERA-TA: Nachbemerkungen zur TA'09, der 9. Österreichischen TA-Konferenz. Wien, 8. Juni 2009". TATuP: Zeitschrift für Technikfolgenabschätzung in Theorie und Praxis (in German). 18 (2): 121–125. doi:10.14512/tatup.18.2.121. ISSN 2567-8833.
3. Brockman, John, ed. (2013). This Explains Everything. Harper Perennial. p. 255. ISBN 0062230174.
4. Thierer, Adam (2018-08-16). "The Pacing Problem, the Collingridge Dilemma & Technological Determinism". Technology Liberation Front. Retrieved 2018-09-23.
5. Kelly, Kevin (2010). What Technology Wants. Viking Press.
6. Rio Declaration on Environment and Development. United Nations. 1992.
7. Collingridge, David (1992). The Management of Scale: Big Organizations, Big Technologies, Big Mistakes. Routledge.

Retrieved from "https://en.wikipedia.org/w/index.php?title=Collingridge_dilemma&oldid=1224612759"

Categories: Technology assessment, Technological change



This page was last edited on 19 May 2024, at 11:40 (UTC).

Text is available under the Creative Commons Attribution-ShareAlike License 4.0; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.


