CHAPTER TWO: HISTORICAL OVERVIEW OF LEARNING AND TECHNOLOGY
We propose that the crucial difference between human cognition and that of other species is the ability to participate with others in collaborative activities with shared goals and intentions.
—Michael Tomasello et al., 2005
Chapter Two covers the following topics:
- Introduction to the history of learning and technology
- Steps in human development
- The invention of the Internet as a Meeting of Minds
- The Web and its social applications
- Historical overview of online learning
- Adjunct Mode Online Learning
- Blended or Mixed-mode Online Learning
- Totally Online Learning
Chapter Two explores the fascinating story of how learning and technology have been integral to human development since our earliest ancestors. Technology has enabled communication and, linked with our most human characteristic of intentional collaboration, is essential to human learning and development. The chapter focuses on specific historical developments that revolutionized communication and expanded our knowledge-building capacities, from the time of our pre-linguistic, prehistoric ancestors to the present Knowledge Age.
FIGURE 2.1 Four Communication Paradigms
Steps in Human Development: Learning and Technology
Our human ancestors, whether hunter-gatherers eking out survival with family and clan in caves, or members of ancient civilizations who built city states and engaged in commerce, differed profoundly from today's societies. Nonetheless, we all share the need to survive and advance: learning, communication, collaboration and the creation of tools are the fundamental mechanisms that enable human society to survive and progress. Chapter Two provides a basis for this discussion with an overview of how learning and technology have been interconnected throughout human history and remain key to social and civilizational advancement.
The need and ability to learn (and hence to educate effectively and efficiently) is at the root of human survival and civilization. Since prehistoric kinships, humans have addressed the need to survive and thrive by teaching their young and one another, and by inventing new learning technologies.
And we have done so collaboratively and collectively. In fact, evolutionary biologists today propose that the dividing line between humans and other species is the ability to intentionally participate in collaborative activities. Traits that anthropologists once believed separated humans from the other great apes, such as tool-making, walking upright, hunting cooperatively, and fighting wars, have all been found to exist in other species. Sarah Hrdy (2009), a renowned evolutionary anthropologist, writes that it is intentional collaboration, along with our extra-large brains (relative to body size and compared with other species) and our capacity for language, that marks the dividing line for human behavior, separating our nature from that of other apes (p. 9). Michael Tomasello, director of the Max Planck Institute for Evolutionary Anthropology, writes that "human beings, and only human beings, are biologically adapted for participating in collaborative activities involving shared goals and socially coordinated action plans" (as cited in Hrdy, 2009, p. 9). Collaboration is the key to our survival and to cultural and human development and knowledge.
Hrdy goes on to explore collaboration as the basis for human development: "Unlike chimpanzees and other apes, almost all humans are naturally eager to collaborate with others. They may prefer engaging with familiar kin but they also easily coordinate with nonkin, even strangers. Given opportunities, humans develop these proclivities into complex enterprises such as collaboratively tracking and hunting prey, processing food, playing cooperative games, building shelters, or designing spacecraft that reach the moon" (2009, p. 10). Collaboration is a key characteristic of human development, reflected in all our survival and civilizational activities, from raising our young to gathering food to building spacecraft. The major stages in human development are referred to as paradigmatic shifts: major changes in society, learning, technology and knowledge.
Human social development is the result of key civilizational shifts throughout history. These civilizational shifts (also known as paradigmatic shifts) refer to the major transformations that occurred as technological breakthroughs came together with changing cultural, social and economic conditions to create new contexts, opportunities and challenges.
In both prehistoric and historic periods, technological breakthroughs and new social formations have combined, each influencing the other and establishing new ways of life that, in turn, impacted each successive generation and society. These are turning points, milestones in human development. Scientists generally identify four major paradigmatic shifts, although the names for these shifts vary.
A general and condensed chronology of these major socio-technological shifts includes:
- Speech (40,000 B.C.): the development of speech and intertribal communication in hunter-gatherer communities produces recognizable civilizations based on informal learning, with characteristic crafts and symbolic art;
- Writing (10,000 B.C.): the agricultural revolution interacts with the massing of populations in fertile regions to produce state structures and cumulative knowledge growth, based on the invention of writing and the formalization of learning;
- Printing (1600 A.D.): machine technology and the printing press interact with the development of global trade and communication to expand the dissemination and specialization of knowledge and science;
- Internet (2000 A.D.): advanced network technology interacts with powerful new models of education and training that offer the potential to produce knowledge-based economies and the democratization of knowledge production.
In the 21st century, educators ponder current practice, new technologies, and how to address the gap between the two. Scenarios for new learning technologies and practices are explored in Chapters Six to Nine. Yet, the history of how we navigated our way to the present is also important. This history tells the story of how human learning has been linked with technology, communication and collaboration. This history is important to our understanding of learning in general and also to frame our study of learning theory in the 20th and 21st centuries.
The next section provides a brief synopsis of this history, exploring the paradigmatic shifts representing major leaps in learning and technology:
FIGURE 2.2 Technological Milestones within Four Communication Paradigms
Each shift represents an advance to a new level of knowledge.
- 40,000 B.C.: the development of speech and intertribal communication in hunter-gatherer clans produces recognizable civilizations with characteristic crafts and symbolic art.
As with children today, the newborn in hunter-gatherer communities began to learn within the context of the mother and the surrounding clan and community. Since earliest prehistory, humans learned from observing and imitating the behavior of others.
Our prehistoric ancestors also developed new technologies to assist in personal and communal survival; in this case, communication technologies using the human voice. Speech evolved from grunts, shouts, noises and whistles intended to signal an event or emotion: distress, warning, threat, need, pleasure or pain. Prehistoric speech and language were forms of codified communication: what is good or bad; what to do or not do; who should do it, and when, how and where to do it; and, eventually, why to do it, for cultural or survival purposes. While this period is often characterized as the Stone Age, and advanced stone tool-making is indeed a key characteristic, this is most importantly the age of speech—the most profound technology that humankind has invented.
In prehistoric societies, children learned by observation and mimicry, as well as from the 'technology' of oral education provided by their mothers and the clan. Speech also meant that the communal history of knowledge, beliefs, culture and skills could be passed from one generation to the next. This early stage in the technology of language enabled 'oral histories' to pass from person to person and generation to generation through stories, legends, rituals and songs. Wall drawings were invented to illustrate, instruct, and enhance oral traditions. Language and illustration were important tools for sharing, archiving and transmitting information and knowledge.
- 10,000 B.C.: the agricultural revolution interacts with the massing of populations in fertile regions to produce state structures and cumulative knowledge growth based on the invention of writing and the formalization of learning.
The term agrarian revolution refers to the transition from the bands of hunters and gatherers that characterized our earliest ancestors to an agriculture-based economy and society. Whereas hunters and gatherers were constantly on the move, tracking herds of animals and adapting to the seasons to harvest grain, fruit, nuts and roots, the development of agriculture made human settlement (or semi-settlement) possible. The domestication of plants and animals enabled a more stable lifestyle and settled society, compared to the hand-to-mouth existence of foraging, which required far greater expenditure of time and energy.
The process of producing and harvesting food, the increased yield of crops, and reliable access to such essentials had immense social and economic impact on these communities. With more stable living conditions, these early communities had the time and energy to learn new skills: they became more proficient in trading, and the structure of their communities grew more complex; they established trading economies and private property, as well as social, economic and political stratification. All of this contributed to the continuous development of culture, knowledge and new technologies.
The technology of writing—numeracy and literacy—evolved during this period. Numeracy (a system of counting and recording numbers) and literacy (a system of writing letters and/or words) developed as a result of the surplus of food and goods derived from domestic production and trade. The storage and trade of goods required a form of recording, and writing solved the need to count and describe items held, received or distributed, as well as to designate ownership. Literacy was, at its most basic, a method of record keeping. It is believed that characters used for communication emerged around 3500 B.C. in various agrarian civilizations, linked to the development of surplus yields and private ownership. The original Mesopotamian writing system derived from a method of account keeping; by the end of the fourth millennium B.C. it had evolved into a system of marks made by pressing a triangular-shaped stylus into soft clay to record numbers. Around the twenty-sixth century B.C., cuneiform began to represent syllables of spoken Sumerian and became a general-purpose writing system for logograms, syllables, and numbers. The world's oldest alphabet was developed in central Egypt around 2000 B.C. from a hieroglyphic prototype (Martin, 1994).
The increased organization and specialization of society required changes in how and what people learned. The majority of the population continued to learn through mimicry and apprenticeship: this included observation, hands-on training, and experience by trial and error. Eventually, however, formalized learning emerged among settled populations.
Formalized learning was 'invented' as a way to teach a select group of people chosen to serve in matters of importance, such as tasks related to money or religion. These early societies required workers with literacy and numeracy skills to guarantee accuracy and accountability. Instructors ensured that their curricula were recorded, maintained and updated, and that learning outcomes were assessed. Such formalized education eventually became the basis of schooling as we know it today.
Formal learning in these early societies focused on the skills of writing, reading and counting, but also on civil behavior as appropriate to the students’ socio-economic standing. Formal education was based on exclusivity. Only people from privileged backgrounds were allowed to learn the skills to become scribes or officials for political, religious, economic or military service. Through prescribed learning, these people were socialized to be upstanding citizens and followers of the faith.
The Greek historian Xenophon (c. 430-354 B.C.) wrote about learning and Persian law. He explained that in Persia men were educated to avoid lawless behavior and that formal education served a preventative purpose. In Persian society, special areas of the royal court were set aside for learning: a very early form of school.
Formal learning can also be traced to the Greek philosopher Plato (427-347 B.C.), who founded the Academy in Athens, regarded as the first institution of higher learning in the Western world.
The history of writing is fascinating and profoundly important to understanding communication and human progress through the ages. According to H.J. Martin (1994), "All writing is tied to the form of thought of the civilization that created it and to which its destiny is linked" (p. 15). A full history of writing is beyond the scope of this book, however; the point here is the integral links between writing, language, thought and knowledge. Writing enabled knowledge to be communicated: it could be disseminated to others, near and far, and hence not only transmit ideas but contribute to ongoing discussion, debate and knowledge building. Writing also enabled knowledge to be archived and hence disseminated historically: future generations could read the prevailing thoughts and ideas, and thereby learn from and add to the cumulative body of human knowledge. Writing is the basis of formal learning.
The first known writing is believed to have developed in Mesopotamia toward the end of the fourth millennium B.C. Writing spread to Crete around 2000 B.C., and by the ninth or eighth century B.C. the Greek alphabet—the ancestor of modern Western alphabets—had appeared (Martin, 1994, p. 34). In all of its varieties and instantiations, whether in the ancient Middle East, China or pre-Columbian America, writing emerged and was revered as communication with the deities.
Similarly, Roman papal doctrines were thought to 'hold the word of God'; they were considered divine and sacred and therefore not to be seen by or communicated directly to the common man or woman, but were mediated by the Church and its few literate priests. Furthermore, all sacred documents were written in Latin, a language the common person did not understand. Reading and writing were skills reserved for a very select few.
However, the power of the written word—four thousand years after its invention—was about to be unleashed. In the 15th century, the printing press (together with related technologies such as paper-making, which had by then spread to Europe) was invented in the Western world.
III. Printing (and Mass Communication)
Arguably, the most famous ‘learning technology’ of the third paradigm was the invention of printing. Johannes Gutenberg (1398-1468), a German printer and goldsmith, invented movable type and the mechanical printing press around 1439. He is also known for printing the Gutenberg Bible: approximately 180 copies were published in 1455.
The invention of the printing press was a technological innovation with tremendous implications for Western society, in that it provided a means for disseminating ideas not only about religion but also about science, education and politics. The printing press enabled books, such as the Bible, to be printed in larger numbers and at lower cost than the handwritten manuscript versions previously available only to the Church and the elite. For the first time in history, commercial mass production of books was possible. Printing made books more economical to produce, and wider segments of the population could afford them. Publishing allowed people to follow debates, take part in discussions, and learn about matters that concerned them. One early example: pamphlets on the Plague taught people how to deal with the illness.
Gutenberg’s printing press revolutionized learning and knowledge transmission in Europe to an unprecedented degree: pamphlets, booklets and complete books could now be efficiently and cost-effectively produced and disseminated.
Printing spread widely and rapidly across Europe, and by the end of the 15th century print shops operating presses like Gutenberg's numbered in the hundreds. The rapid spread of publishing was a major factor contributing to the Renaissance, the Scientific Revolution and the Protestant Reformation. Martin Luther's 95 Theses, reputedly nailed to the doors of the Castle Church in Wittenberg, Germany in 1517 (though this claim is debated), was subsequently printed and widely circulated. The production of more books and the propagation of ideas to a wider audience fueled new ways of understanding and influenced a significant shift in Western thought. Notably, the broadsheet format of Luther's 95 Theses and its circulation became a prototype for the newspapers and mass media of today.
The production of printed books and other reading materials motivated the public to learn to read and to seek formal education. The availability of reading materials meant that more people learned to read and expanded their knowledge on a wide range of topics.
Once books became more widely available, the momentum toward public access to information and knowledge was unstoppable. By 1465, printing presses across Europe were driving the rapid growth of printed materials and the dissemination of information to an eager public.
There were more than 250 centers of the print trade by 1 January 1501, the fatal moment after books, now out of their cradle, are no longer called incunabula. The estimated 27,000 known publications certainly represent more than ten million copies, circulated in less than two generations in a Europe whose population was under a hundred million. This would give a maximum of some few hundred thousand confirmed readers. (Martin, 1994, p. 227)
The relationship between learning and technology is again illuminated: the base of knowledge that had expanded with the development of speech and then of writing was now further advanced as publishing created and responded to new learning needs. The rise of machine manufacturing and industrialization in the 18th, 19th and 20th centuries was integrally linked with the need for mass literacy techniques and technologies: mass communication intensified the need for mass education. With the rise of modern science, new theories of learning emerged in the 20th century to address the industrial age.
- 2000 A.D.: advanced information technology interacts with powerful new models of education and training that offer the potential to produce knowledge-based economies and the democratization of knowledge production.
The invention of computer networks in the late 1960s and computer-mediated communication (CMC) in the early 1970s initiated a shift in how we understand our most basic concepts of education, community and society. Our sense of who we are as citizens of the world, how we meet and collaborate with others, and how we learn and contribute to social development was transformed by the telecommunications revolutions of the 19th century (telegraph, 1844; telephone, 1876) and early 20th century (television, 1925; satellite technology, 1957), and more recently and profoundly by the Internet revolution of the late 20th century. The developments associated with the Internet, the World Wide Web (known as the Web) and other online technological inventions have profound implications for learning theory and practice.
How quickly and widely computers and the Internet have impacted work and society worldwide is astounding: Arpanet was invented as recently as 1969, email over packet-switched networks in 1971, computer conferencing/forums in 1971, the public Internet was launched in 1989, and the World Wide Web was invented in 1990 and released to the public in 1993.
It is important to recall that, up to the 19th century, communication was almost entirely restricted to one's locality. The first public trans-Atlantic telegraph message was sent by Queen Victoria in 1858. Until then, technologies for communicating at a distance were more or less those of 5,000 years earlier: messages were carried by courier on foot, on beast or by boat. Distance communication was controlled by those with power (royal, military or religious leaders). Throughout history, communication among common people was limited to local, face-to-face conversation, or to 'distance' technologies such as talking drums, smoke signals, carrier pigeons and semaphore, generally employed only in times of distress. Otherwise, information traveled slowly. Even with the introduction of the printing press, important new ideas took years to disseminate from city to city, country to country, or between Europe and the New World of the Americas. Until relatively recently, the spread of knowledge was limited.
In the 20th century, the invention and adoption of the Internet introduced a great leap forward in communication, both quantitative and qualitative: a knowledge transformation on a global scale.
The invention of computer networking technologies has its roots in a vision of collaboration, community, learning and knowledge. One of the earliest technological precursors is hypertext, a concept and technology important as the precursor to, and inspiration for, the World Wide Web.
The history of hypertext began in 1945 with Vannevar Bush’s article in The Atlantic Monthly entitled “As We May Think,” about a futuristic technology that he called Memex, “a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory” (p. 108).
Bush's groundbreaking vision of a technology to enhance thought predated the modern computer. Nonetheless, his article and his concept of the Memex directly influenced and inspired the two Americans generally credited with the invention of hypertext: Ted Nelson and Douglas Engelbart.
Nelson coined the words ‘hypertext’ and ‘hypermedia’ in 1965 and worked to develop a computer system that enabled writing and reading that was nonsequential and presented the potential for cross-referencing and annotating (Nelson, 1974). In Project Xanadu, Nelson sought to create a computer networking system that enabled users to view hypertext libraries, create and manipulate text and graphics, send and receive messages, and structure information. Such a system allowed users to create linkages among ideas and information resources, to explore the interconnections, and generate multiple perspectives on a topic (Nelson, 1987). This vision predates but anticipates the Web.
Douglas Engelbart, like Vannevar Bush two decades earlier, was concerned with enhancing the intellectual capacity of people. In 1962, Engelbart published his seminal work, "Augmenting Human Intellect: A Conceptual Framework," proposing the use of computers to augment human intellect. With his colleagues at the Stanford Research Institute, Engelbart developed a computer system to augment human abilities, including learning. The system, simply called the oNLine System (NLS), debuted in 1968 and was later marketed as 'Augment.' One of its most notable design features was the emphasis on tools to support collaborative knowledge work. The Augment project "placed the greatest emphasis on collaboration among people doing their work in an asynchronous, geographically distributed manner" (Engelbart & Lehtman, 1988, p. 245). Augment enabled idea structuring as well as idea sharing; while it supported linkages among ideas and authors, the system employed a hierarchical structure. Xanadu and Augment "…were the first systems to articulate the potential of computers to create cognitive and social connectivity: webs of connected information and communication among knowledge workers" (Harasim, 1990a, p. 41).
The initial concept of a global information network came from J.C.R. Licklider in the late 1950s. At a time when computers were viewed as giant calculators, Licklider envisioned networked computers facilitating online community, online personal communication, and active, informed participation in government (Hafner & Lyon, 1996, p. 34). His 1950s visions were prescient, among the earliest precursors of personal computers and computer networking.
In 1960, Licklider published his seminal paper "Man-Computer Symbiosis," proposing the potential of computers to transform society. He put forward a vision that anticipated collaborative learning, emphasizing the potential of the computer to support group discussion, networking, multiple perspectives, active participation and community practice. Although Licklider left the Arpanet project before it was completed, his vision of Arpanet as a knowledge network remained. The actual technological development of Arpanet was led by Lawrence G. Roberts of the Massachusetts Institute of Technology (MIT).
Another very important technological development related to human communication and collaboration was computer conferencing. Computer conferencing was invented to support group communication and decision-making; the first system, EMISARI, was developed by Murray Turoff in 1971. In 1974, Turoff founded the Computerized Conferencing and Communications Center at the New Jersey Institute of Technology (NJIT) and developed the EIES computer conferencing system. Other conferencing systems developed in the early-to-mid 1970s were PLANET, Confer and Forum. Computer conferencing is important to the history of online education because many of the earliest ventures in online course delivery involved computer conferencing. Over the next thirty-five years and to the present, Turoff engaged in research and development on computer-mediated communication (CMC) with Starr Roxanne Hiltz. Much of their work was, and remains, directly related to education; one of the most important outcomes was the development and implementation of the 'Virtual Classroom,' which pioneered the first totally online delivery of undergraduate courses in the world. It was also the first major scientific field trial of online education and, as such, provided an important empirical base for others in the field (Hiltz, 1994).
In 1990, Tim Berners-Lee, a scientist at CERN (the European Organization for Nuclear Research), invented the World Wide Web to meet the demand for information sharing among scientists working in different universities and institutes around the world. The Web's defining capability, hypertext links within documents that could reach into other documents anywhere on the Internet, was made accessible through early browsers such as Lynx, developed in 1992; the Web itself was released to the public in 1993.
Arpanet and Internet: Meeting of Minds
The origins of the first computer network, Arpanet, are linked to a vision of human collaboration and community. While the term meeting of minds was not actually used, the concept suggests a powerful metaphor for understanding computer networking. At one level, "meeting of the minds" (also referred to as mutual assent or consensus ad idem) is a phrase in contract law describing the intentions of the parties forming a contract; in particular, it refers to a common understanding in the formation of the contract. Arpanet was essentially that: a mutual agreement to build a network. And since networks had not yet been invented, it was a commitment to an intention.
Moreover, computer networks would represent a meeting of minds in both social and technological respects. The inventors of Arpanet employed social terms to characterize new tools and technologies. The basic formulation of Arpanet was based on cooperation and negotiation: host-to-host communication on the network was facilitated by a 'handshake,' a social term describing the key technical concept of how the most elemental connections between two computers are handled. The term 'protocol' was adopted from the ancient Greek protokollon, the top sheet of a papyrus scroll that contained the synopsis of the document, its authentication and date (Hafner & Lyon, 1996, p. 144). Protocols also reflect the etiquette of diplomacy, consensus and collective agreement. Network protocols became the technical and social glue of connectivity. Technically, a network protocol specifies how packets of information are addressed and exchanged; but, as Vint Cerf, one of the notable architects of the Internet, observed, protocol also refers to informal consensus: "The other definition of protocol is that it's a handwritten agreement between parties, typically worked out on the back of a lunch bag, which describes pretty accurately how most of the protocol designs were done" (as cited in Hafner & Lyon, 1996, p. 146).
Network technology was socially and technically constructed by an informal group, the Network Working Group (NWG), who worked together in a collaborative and consensual manner. New ideas were sent out to group members and sites as notes called "Request for Comments" (these RFCs were sent via regular post: email had not yet been invented). A spirit of community, openness and collaborative design was invoked. As Hafner and Lyon remark: "For years afterward (and to this day) RFCs have been the principal means of open expression in the computer networking community, the accepted ways of recommending, reviewing, and adopting new technical standards" (Hafner & Lyon, 1996, p. 145).
Finally, Arpanet represented a meeting of minds not only in the technological design and social construction of computer networking, but also in its applications. Email, computer conferencing, forums, the Internet, virtual communities, online collaborative learning, and online collaborative work were products of computer networking and each in their own way, articulations of a meeting of minds.
Email was the first, and remains among the most successful, social software ever invented. Within four decades of the Internet's invention, by 2010, Internet and email penetration had reached twenty-five percent of the world's population, or 1.8 billion people.
The World Wide Web (the Web) was invented by Tim Berners-Lee in 1990 as a group work environment to facilitate online collaboration among his fellow scientists at CERN. Based on the concept of hypertext, the project aimed to facilitate information sharing among researchers working in different universities and institutes all over the world. CERN is not a single laboratory but the focal organization for an extensive community of more than 8,000 scientists from over sixty countries. Although these scientists typically spend some time on the CERN site, they usually work at universities and national laboratories in their home countries. Access to online communication was therefore essential to create and maintain this place-independent community.
The basic idea of the Web was to merge the technologies of personal computers, computer networking and hypertext into a powerful and easy to use global information system. Berners-Lee developed the protocols underpinning the Web in 1990. The first Website went online in 1991. On April 30, 1993, CERN announced that the Web would be free to the public, to enhance interdisciplinary, international and inter-institutional discourse.
The rate of public adoption of the Web has been astronomical and its implications transformational. Within a few months of its public appearance, the Web was adopted worldwide as a means of easing access to the Internet and enabling richer graphical capabilities. Within fifteen years, the Web had accumulated one billion users; by 2010, it had two billion. The Web thus became central to public access to the Internet and enabled the creation of a global knowledge network.
The rise of the Web was a major catalyst in public use of online technologies: it made access to the Internet easy; it also made the production of online graphics accessible to basic users, making the Web a hospitable and valuable communication space. The Web helped to popularize the term ‘online.’ ‘Online’ was no longer a remote or obscure territory: even the next-door neighbors were ‘online.’ Communication activities such as email, forums and texting came to expand or replace postal mail, telephone calls and memos.
Having an email identity and online presence is today not only common but expected. An online presence is both a social and an economic phenomenon. We use it increasingly for social communication and work activities. In the early 21st century, the Web underwent a technological maturation and a shift that emphasized social interaction and new interactive tools. Whereas the original Web was based on static Web pages, Web 2.0 focused on dynamic sharable content.
Web 2.0 (the Collaboration Web)
Web 2.0 has come to be associated with, even defined as, the social web or the collaborative web. While social communication, interaction and collaboration, as well as user-generated content, characterized learning networks, online education and virtual communities in the pre-Web decades of Arpanet and the Internet, the emphasis of Web 2.0 was on new or better tools for social interaction, community, collaboration and content construction. Web 2.0 marks an evolution in the tools available to create and support online communities, as well as new developments such as social networking sites, wikis, blogs and communities based on the sharing of social objects such as photos, videos, music, products, encyclopedia topics and classified ads.
FIGURE 2.3 Web 2.0 : The Collaboration Web
The original social software of email and group forums remains a major activity on the Web, but it is the invention and adoption of social networks that marks the keystone of Web 2.0. Online social networks, renowned for social discourse and relationship building, were first launched in the early 2000s with Friendster (2002), MySpace (2003) and Facebook (2004). They have become the major online application. By 2011, Facebook had around 750 million unique visitors….
The term ‘blog’ derives from weblog, which refers to a personal journal or diary that is available on the Web. The person who maintains and updates the blog is called a blogger. The term originated as a web site devoted to a chronological publication of personal thoughts with associated web links, with the postings organized according to the most recent entry. Blog technology enabled the organization of text postings, images and hypertextual linkages. Blogs gained popularity during the 2004 US elections, when they were used to report on or discuss political events. Blogs are written in a conversational manner, and a blog today typically includes comments from readers that can give rise to a discussion. Nonetheless, a blog is not intended to be a group discussion forum. Blogs were not developed to support social discourse and, unlike threaded discussion forums or computer conferences, do not provide technological support for group discussions that evolve and deepen over time.
Web 2.0 is characterized by social networks that are built around the sharing and discussion of particular social objects. Social networks such as Facebook are built principally around posting of messages, while other social networks have emerged based on sharing of photos, videos, or other products or media. Many of these networks are associated with the concept of user-generated content because the members create and post content that is public and can be shared with anyone on the Web.
Examples of social networks that have formed around social objects include:
- Flickr: group discussion related to posting and sharing of photos
- Amazon: group discussion related to posting and sharing of products
- YouTube: group discussion related to posting and sharing of videos
- Wikipedia: group discussion related to posting and sharing of encyclopedia topics
A search engine is a computer program that searches and retrieves files or information from a computer database or computer network. A web search engine is a program or tool for searching the entire Web. Given the vast quantity of information available on the Web, search engines have become an essential tool for ‘surfing’. Google.com, for example, is not only the leading web search engine but the most visited website in the world. The domain was registered on September 15, 1997, and by 2010 the site had received well over 10 billion hits….
The Web has also introduced remarkable opportunities for transforming teaching and learning and advancing online learning. However, the history of online learning began long before the Web; it was one of the earliest applications of the net. Within a few years after the invention of Arpanet, the beginnings of online education took shape.
Historical Overview of Online Learning
Online learning (or online education) refers to the use of online communication networks for educational applications such as course delivery, support of educational projects, research, access to resources and group collaboration. Today, online learning is largely mediated by the Web. The implications of the Internet and Web technologies for education are still unfolding—providing new experiences to generate understanding of how to benefit from and improve learning online. The need to understand how this major technological revolution is influencing education and transforming our discipline is critical, from the smallest to the most dramatic changes.
TABLE 2.1 Brief History of Online Learning
The earliest form of online education was invented in the mid-1970s by academics who were also engaged as Arpanet researchers. Working on Arpanet developments, these academics introduced the innovations they were encountering as topics in their university courses, thereby presenting students with email (then known as electronic mail) and computer conferencing as course content. Educational experimentation and student interest in these new communication technologies ignited exploration, and as a result computer-mediated communication (CMC) became not only course content but pedagogical process. Students began to use email to send questions to their professors and comments to one another, while faculty explored applications of email and computer conferencing for providing additional information to students, clarifying questions, and expanding opportunities for time- and place-independent group discussion in their courses.
Soon educators from a wider set of disciplines within universities and eventually from the school system began to experiment with educational CMC, and the ‘adjunct’ or enhanced mode of online learning was born.
Adjunct Mode Online Learning
The adjunct or enhanced mode of online education refers to the use of network communication to enhance traditional face-to-face (f2f) or distance education. In adjunct mode, the use of the Internet is an add-on that complements the existing curriculum. The online activities do not replace the traditional techniques, nor do they represent a significant portion of the course grade; they are used to enhance the class activities. Examples of this pedagogical approach include the use of email to contact a professor or submit assignments, the distribution of course material by the instructor, and the administering of quizzes or distribution of course grades. Adjunct mode also involves student use of the Internet to search for course resources and undertake course-related research. It provides a new approach for extending group discussion: the use of computer conferences or forums enables the continuation of discussion initiated in class or the inclusion of guest experts or peers from other locations. Originating in the 1970s, adjunct mode was the first major educational application on the Internet. Today, adjunct mode is ubiquitous in the use of the Internet for learning throughout the world.
Blended Mode Learning
By the early 1980s, new online educational applications emerged, expanding adjunct mode into ‘mixed-mode’ or ‘blended mode,’ in which a significant portion of the traditional face-to-face classroom or distance education course was conducted online (Harasim, 2006a). In blended mode, typically about fifty percent of course activities, and of the overall grade, are based on online activities. Today the term blended mode is used in many ways: it typically refers to a mix of face-to-face and online course activities. However, blended learning can also be used to describe a pedagogical mix of distance education or courseware applications with online collaborative activities such as group discussions, seminars, debates, research or group projects. Blending may also be institutional, as in the case of a degree program offered by two or more institutions, or instructional, to refer to a course with different teachers.
Totally Online Learning
The earliest totally online courses were developed and offered in the mid-1980s at post-secondary levels. The courses were based on online collaborative learning approaches such as seminars and group discussion (Harasim et al., 1995; Mason & Kaye, 1989; Harasim, 1990b; Harasim, 2006a).
As educators and researchers adopted this new domain in their work, they also wrote about it and presented their experiences to scholarly and professional venues; interest in online learning was generated and the field began to grow. However, in its early manifestations in the 1980s, it remained limited to a relatively small group of early advocates.
Most of the early online learning pioneers came from the face-to-face classroom context. The earliest users and adopters emphasized pedagogies involving student collaboration, interaction and knowledge building. In the decade before the public launch of the Internet and the World Wide Web, distance education did not identify with online education, nor were courseware providers able to easily offer their individualized multimedia pedagogy online. The collaborative learning approach was largely the norm for online education in the 1980s.
Chapter Two discussed how from mankind’s earliest days, learning and technology have been profoundly linked; they are kindred spirits, consonant and interconnected. And, linked to collaboration, they enhance the essence of what it means to be human. The four major paradigmatic shifts associated with speech, writing, printing and the Internet illuminate how technology and learning formed the basis for civilizational advances. The invention of the Internet and the Web are transforming our contemporary society, thereby introducing opportunities and motivation for transforming the conditions of learning: how we view learning, and how we can shape our educational practice to better support learning.
 The term incunabula refers to the earliest printed books of a genre, often used exclusively to mean those printed before 1501. It derives from the Latin word cunae, meaning ‘cradle’. northwoodsbookshop.com/bkterm.htm
 The prefix hyper (from the Greek prefix meaning ‘over’ or ‘beyond’) signifies overcoming the limitations of linear text by a system that creates linkages and enables multiple pathways through text or media. Hypertext is used to cross-reference collections of data in online documents, software applications, or books and can develop very complex and dynamic systems of linking and cross-referencing. Many consider the most famous implementation of hypertext to be the World Wide Web.