Monday, June 23, 2014

Blog 2: A Vast Machine

A Vast Machine: Thinking Globally

"Thinking Globally," a chapter in Paul Edwards's book A Vast Machine, examines the concept of knowledge and how we know what we know. The chapter opens with the example of climate change, which puts Edwards's ideas in context: it shows how technology and the process of data collection are used to produce information, and how that information is conveyed to others. The purpose of the chapter is to explain how the world is becoming a single, interconnected system of information. Because this information is interconnected, it forms an infrastructure, a stable foundation on which the modern social world depends. To think globally, one must not only gather information but also convey it to the public in a way that makes it relevant and important.
The chapter then turns to how knowledge is formed in a sociotechnical system, answering the question of how we know what we know. Edwards defines a sociotechnical system as a knowledge infrastructure comprised of "robust networks of people, artifacts, and institutions that generate, share, and maintain specific knowledge about the human and natural worlds" (Edwards, 2010, p. 17). For information to be relevant and important, it must fit into an institution that shares and maintains that knowledge. In the climate change context, for example, information is only as important as how well it is conveyed to the networks of people and institutions that share it. The sociotechnical system thus answers the question of how we know what we know: it describes the way in which people are convinced that things are true, useful, and consistent with other things they already know (Edwards, 2010, p. 17). The system ensures that people are aware of information by creating channels through which it is shared and conveyed to the public. The chapter suggests that sharing information involves using instruments, such as computers, to gather data, and then publishing the data, for instance over the Internet, to convince people that what you are conveying is true and useful. The sociotechnical system gives the community a way to understand what you have found and what you think it means (Edwards, 2010, p. 17).
The chapter also discusses the concepts of the "vast machine" and knowledge infrastructure. A vast machine is a machine whose constituent infrastructures are entangled with one another. Infrastructures are basic systems and services that are reliable, standardized, and widely accessible, at least within a community (Edwards, 2010, p. 8). Infrastructures exhibit the following features: embeddedness, transparency, reach, being learned as part of membership, links with conventions of practice, embodiment of standards, being built on an installed base, becoming visible upon breakdown, and being fixed in modular increments (Edwards, 2010, p. 9). In the 1980s, historians and sociologists of technology began studying the infrastructure phenomenon intensively (Edwards, 2010, p. 9). These researchers developed the theory of large technical systems (LTS), which applies to systems such as telephones and railroads (Edwards, 2010, p. 9). The LTS approach identified a series of common stages in infrastructure development: invention, development, technology transfer, consolidation, splintering, and decline (Edwards, 2010, p. 10). These stages, and the theory of infrastructure itself, point to the idea that for information to be exchanged there must be an infrastructure made up of interrelated systems, many of them technological. Infrastructures are where information is produced, communicated, stored, and maintained.
The chapter also discusses the basis of scientific knowledge and what that knowledge depends on. To create and maintain scientific knowledge, one needs the following: enduring communities with shared standards, enduring organizations, mathematics, conventions and laws, theories, physical facilities, and support staff (Edwards, 2010, p. 17). If scientific knowledge is to reach individuals, someone must not only create the information but also make it exchangeable. For information to be exchanged, it must belong to a community within an organization that sustains that community. Scientific knowledge is distinctive because it has its own specialized vocabulary and its own laws that govern the information.
Finally, the chapter discusses the concept of globalist information. It begins by recounting that President Johnson sent a photograph of the Earth to world leaders to show how fragile the world is and how interconnected everyone in it is (Edwards, 2010, p. 1). The concept of globalist information arises from this action: it refers to systems and institutions for transmitting information about the world as a whole (Edwards, 2010, p. 23). The concept builds on the fact that the world is interconnected and that the sharing of information is essential for the world to function. The chapter states that the best-developed globalist system for exchanging information is found in meteorology (Edwards, 2010, p. 24), and argues that for the world to learn more about climate change, everyone needs to exchange information on the subject.
Through the climate change context, "Thinking Globally" suggests that a growing amount of information is being produced. The chapter discusses how information is exchanged, how people perceive it, and how there is a call for information exchange to better serve the public. Given the plethora of information produced daily across the scientific community and other data-producing fields, there is a genuine need for information sharing in the world. The chapter concludes that information is important and can be organized into infrastructure, which in turn can be used by everyone to make the world a better place by keeping people informed about what is going on in it.


Edwards, Paul N. (2010). Thinking Globally. In A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming (pp. 1-25). MIT Press.

Monday, June 9, 2014

Blog 1: The Stupidity of Computers

THE STUPIDITY OF COMPUTERS

"The Stupidity of Computers," written by David Auerbach, examines how computers operate and traces the rise of ontologies. The article details the ways in which computers cannot understand the English language, how difficult a time they have thinking for themselves, and how search engines categorize information (Auerbach, 2012, p. 1). As the use of computers has increased in recent decades, the article lays out a premise that computers can function when given the right instructions but cannot function on their own.

The article begins by stating that computers are dumb even though they have cauldrons of processing power. Even though computers have millions of pieces of information at their fingertips, they cannot access that information unless the user inputs the proper search words. After this introduction, the article discusses the different ways computers can access information when the right words are used, and how, once they are, it becomes a matter of which words the search engine picks out as the most important. Many researchers have tried to devise a way for computers to understand the English language, which would make computers more effective and "smart," especially as search engines; this has yet to happen (Auerbach, 2012, p. 3). Although computers and search engines have still not learned the English language or the implications of certain words, search engines such as Google have made searching for topics better without that understanding.
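The idea that a search engine can pick out a query's "most important" words without understanding English can be sketched in a few lines. The following is a minimal, hypothetical illustration (not code from the article) that scores documents by weighting shared words so that rarer words count for more:

```python
import math
from collections import Counter

def tf_idf_scores(query, documents):
    """Score each document against a query by weighting shared words.

    Words that appear in few documents (high inverse document
    frequency) count for more: one crude way a search engine can
    decide which query words are "important" without understanding
    English at all.
    """
    tokenized = [doc.lower().split() for doc in documents]
    n = len(tokenized)
    # idf: words that are rare across the collection get higher weight
    idf = {}
    for word in set(query.lower().split()):
        containing = sum(1 for doc in tokenized if word in doc)
        idf[word] = math.log((n + 1) / (containing + 1))
    scores = []
    for doc in tokenized:
        counts = Counter(doc)
        scores.append(sum(counts[w] * idf[w] for w in idf))
    return scores

docs = [
    "the cat sat on the mat",
    "stock markets fell sharply today",
    "the market for cat food is growing",
]
print(tf_idf_scores("cat food market", docs))
```

Notice that the second document, which mentions "markets," scores zero for the query word "market": the program matches strings, not meanings, which is exactly the kind of stupidity Auerbach describes.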

Google's founders, Sergey Brin and Larry Page, came up with a way to link searches to the websites that are most popular among other websites (Auerbach, 2012, p. 5). Instead of simply counting words on a website to determine whether it fits a search, Google found a way to surface the most popular sites on the Internet (Auerbach, 2012, p. 5). Even so, computers still have not learned the English language or its implications. Websites are now beginning to use ontologies, conceptual frameworks that define any number of kinds of entities and any number of relationships between them (Auerbach, 2012, p. 4). Amazon has used ontologies to suggest other items for customers to purchase. The issue with ontologies, however, is that they are human-made; computers are merely programmed to use them. The same issue extends to Facebook, Twitter, and Wikipedia.
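The popularity-based ranking Brin and Page pioneered can be illustrated with a toy version of the PageRank idea. This is a hypothetical sketch with made-up site names, not Google's actual implementation; it shows that a page can be ranked highly by counting links alone, with no understanding of the page's text:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Rank pages by how many other well-ranked pages link to them.

    A page's score is a share of the scores of the pages linking to
    it, so popularity among popular pages wins.  `links` maps each
    page to the list of pages it links to.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # rank flowing in from every page that links to p,
            # split evenly among each linker's outgoing links
            incoming = sum(
                rank[q] / len(links[q]) for q in pages if p in links[q]
            )
            new_rank[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical four-page web in which everyone links to "popular.com"
toy_web = {
    "popular.com": ["a.com"],
    "a.com": ["popular.com"],
    "b.com": ["popular.com", "a.com"],
    "c.com": ["popular.com"],
}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # the most-linked-to site wins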

            The purpose of the article is to discuss how computers cannot think for themselves, which makes them stupid.  The article lays out how humans have made computers smart but when it comes to computers functioning on their own it is not acquirable, at this moment in time.   The purpose is to show the ways popular websites on the Internet function and how the computer did not make this happen but how the people programed the computer to make the function the website provides exist. 

            The importance of the article is to inform the general public that computers are not as smart as one may think.  Although computers have tremendous processing power, storage for data, and access to many functions, the computers have not done this themselves.  Humans are responsible for the ways computers operate.  The article also stresses the importance of how computers cannot understand the English language and must use categories set up by humans to understand. 

            The article’s implications are huge.  They suggest that computers are dumb because they cannot work on their own and cannot understand the English language.  The article also implicates the ways in which a computer is useless without the help of a human.  The article suggests that computers will become more accessible to the general public but the general public must dumb themselves down and put themselves into categories for computers to work.  The article suggests that computers will not change in the way that they work but people will have to change the way they act around computers and how to use them properly. 

            Around the Internet there are many ways in which David Auerbach’s thoughts and implications are exemplified.  For example, when one searches for a certain subject matter, such as a popular TV show, the computer does not understand the TV show but can link you to the TV show’s website.  (www.google.com).  On Amazon’s website, when one searches for a certain textbook, Amazon suggests study aides to go along with those textbooks.  (www.amazon.com).  This is another example of how the computer is not thinking of those study aides but a way the people in charge of categorizing items have linked the computer to show items that can be offered.  Many websites on the Internet, especially shopping websites, use this mechanism.  However the computer is not suggesting this to you but the people who have programmed the computer to offer such items of clothing are offering certain items to you.

            Although this article operates on the premise of computers being stupid, the categorization of intelligence should not be used when speaking of computers.  Considering computers are machines, they should not be described as being dumb or smart because those words deal with intelligence, in which computers do not possess.  However, computers are not dumb.  They are machines that can be used in a variety of ways.  Although they cannot think on their own, they aid people in their everyday lives.  Therefore, computers are not stupid per say but they are not as smart as human beings.    



Auerbach, David. The Stupidity of Computers, Machine Politics. Issue 13, Winter 2012.