Tuesday, 11 February 2014

Open Hypermedia and the Web

Using Microcosm/Electronics & Computer Science © 1993/CC BY 2.0


Tim Berners-Lee, the main architect of the World Wide Web (W3), developed the system while working for CERN, the European Organisation for Nuclear Research, in the late 1980s. W3 was developed to overcome difficulties in managing information exchange via the Internet. At the time, finding data on the Internet required pre-existing knowledge gained through various time-consuming methods: the use of specialised clients, mailing lists, newsgroups, hard copies of link lists, and word of mouth.

At CERN, a large number of physicists and other staff needed to share large amounts of data and had begun to employ the Internet to do this. Although the Internet was acknowledged as a valuable means of sharing data, towards the end of the 1980s the need to develop simpler, more reliable methods encouraged the creation of new protocols using distributed hypermedia as a model. 

Developments in Open Hypermedia Systems (OHSs) had gained pace throughout the 1980s; a number of stand-alone systems had been prototyped and early attempts at a standardised vocabulary had been made [1]. OHSs share two key features: the separation of link databases (‘linkbases’) from documents, and the provision of hypermedia functions to third-party applications, with the potential for accessibility within heterogeneous environments.

Two key systems, Hyper-G, developed by a team at the Technical University of Graz, Austria [1], and Microcosm, originating at the University of Southampton [5], were at the heart of pioneering approaches to hypermedia. Like W3, both were launched in 1990, but within ten years both had been outpaced by W3's overwhelming popularity. Ease of use, the management of link integrity and content reference, and the ‘openness’ of the underlying technology all contributed to W3's success. However, both Hyper-G's and Microcosm's approaches to linking media continue to have relevance for the future development of the Web.

The Dexter Hypertext Reference Model

In 1988 a group of hypertext developers met at the Dexter Inn, New Hampshire, to create a common terminology for interchangeable and interoperable hypertext standards. About ten contemporary hypertext systems were analysed and the commonalities between them described. Essentially, each of the systems provided “the ability to create, manipulate, and/or examine a network of information-containing nodes interconnected by relational links.” [6]

The Dexter Model did not attempt to specify implementation protocols, but provided a vital reference model for future developments in hypertext and hypermedia. The Model identified a ‘component’ as a single presentation field containing the basic content of a hypertext network: text, graphics, images, and/or animation. Each component was assigned a ‘Unique Identifier’ (UID), and ‘links’ interconnecting components were resolved to one or many UIDs to provide ‘link integrity’.
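The Dexter storage layer can be sketched informally in code. The class and method names below are invented for illustration; only the ideas, components with UIDs and links that resolve to one or many UIDs, come from the Model.

```python
from dataclasses import dataclass

# A minimal, hypothetical sketch of Dexter-style storage: components carry
# content, and links resolve to component UIDs rather than to raw addresses.

@dataclass
class Component:
    uid: str            # Dexter's 'Unique Identifier'
    content: str        # text, graphics, images and/or animation

@dataclass
class Link:
    uid: str
    endpoints: list     # one or many component UIDs

class Hypertext:
    def __init__(self):
        self.components = {}

    def add(self, component):
        self.components[component.uid] = component

    def resolve(self, link):
        # 'Link integrity': every endpoint must resolve to a stored UID,
        # otherwise the link is dangling and resolution fails loudly.
        missing = [u for u in link.endpoints if u not in self.components]
        if missing:
            raise KeyError(f"dangling endpoints: {missing}")
        return [self.components[u] for u in link.endpoints]

ht = Hypertext()
ht.add(Component("c1", "Node one"))
ht.add(Component("c2", "Node two"))
link = Link("l1", ["c1", "c2"])
print([c.content for c in ht.resolve(link)])
```

Because links point at UIDs rather than documents, deleting a component is detectable at resolution time, the property that embedded W3 links later lacked.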

The World-Wide Web

By the mid-80s Berners-Lee saw the potential for extending the principle of computer-based information management across the CERN network in order to provide access to project documentation and make explicit the ‘hidden’ skills of personnel as well as the ‘true’ organisational structure. He proposed that such a system should meet a number of requirements: remote access across networks, heterogeneity, and the ability to add ‘private links’ and annotations to documents. Berners-Lee's key insights were that “Information systems start small and grow”, and that the system must be sufficiently flexible to “allow existing systems to be linked together without requiring any central control or coordination”.

His proposal also stressed the different interests of “academic hypertext research” and the practical requirements of his employer. He recognised that many CERN employees were using “primitive terminals” and were not concerned with the niceties of “advanced window styles” and interface design [2]. 

Towards the end of 1990, work was completed on the first iteration of W3, which included a new Hypertext Markup Language (HTML), an ‘httpd’ server, and the Web's first browser, which functioned as an editor as well as a viewer. The underlying protocols were made freely available, and within a few years the technology had been used and adapted by a wide variety of Internet enthusiasts who helped to spread W3 to wider audiences.


Microcosm

Aimed at providing solutions to perceived problems in contemporary hypermedia systems, Microcosm was launched as an “open model for hypermedia with dynamic linking” [5] in January 1990. The Microcosm team observed that existing hypermedia systems, although useful in closed settings, did not communicate with other applications, used proprietary document formats, were not easily authored, and, being distributed on read-only media, did not allow users to add links and annotations.

While Microcosm used read-only media (CD-ROMs and laser-discs) to host components within an authored environment, it separated these ‘data objects’ from linkbases housed on remote servers. This local area network-based system allowed all users, authors and readers alike, to add advanced, n-ary (multi-directional) links to multiple generic objects. Microcosm could also process a range of document types, and its modular structure enabled it to offer a degree of interoperability with W3 browsers [7].

While recognising the significance of W3, the Microcosm team identified some weaknesses, especially in the way HTML managed links. Rather than storing links separately, W3 embedded them in documents, which made it impossible to annotate or edit web documents and led to ‘dangling’ or missing links when documents were deleted or URLs changed. In addition, HTML limited how links could be made: there was only a small number of allowable tags, and only single-ended, unidirectional links could be authored. To counter these link integrity issues the Microcosm team developed the Distributed Link Service (DLS), which enabled the integration of linkbase technology into a W3 environment [3].

Using the DLS, W3 servers could access linkbases, enabling users to author generic as well as specific links. Generic link authoring allows users to create links that connect any occurrence of a phrase within a set of documents, and supports bi-directional links within documents.
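The core idea of a generic link can be sketched in a few lines. This is a loose illustration, not the DLS implementation; the linkbase structure and URI scheme below are invented for the example.

```python
# Hypothetical sketch of a 'generic link': the linkbase maps a phrase to a
# target, and any occurrence of that phrase in any document becomes a link
# anchor - the link is never embedded in the document itself.

linkbase = {
    "hypertext": "doc://glossary#hypertext",   # assumed target URI
}

def resolve_generic_links(document: str, linkbase: dict) -> list:
    """Return (phrase, offset, target) for every phrase match in the text."""
    found = []
    lowered = document.lower()
    for phrase, target in linkbase.items():
        start = 0
        while (i := lowered.find(phrase, start)) != -1:
            found.append((phrase, i, target))
            start = i + len(phrase)
    return found

doc = "Hypertext systems store links; hypertext readers follow them."
print(resolve_generic_links(doc, linkbase))
```

One linkbase entry thus yields anchors in every document served, which is why a single edit to the linkbase could re-link an entire collection.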


Hyper-G

Hyper-G offered a number of solutions to the linking issues identified by others working in hypermedia systems development. In a similar manner to Microcosm, Hyper-G stored links in link databases. This allowed users to attach their own links to read-only documents; multiple links could be made to documents, or to anchors within text or any other media object; users could readily see what objects were linked to; and links could be followed backwards, so users could see “what links to what”. Unlike Microcosm, the system used a probabilistic flood (‘P-Flood’) algorithm to manage updates to remote documents and linkbases, ensuring link integrity and consistency by, in essence, informing links when documents had been deleted or changed.
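The details of P-Flood are not covered here, so the following is only a generic gossip-style sketch of the underlying idea, not Hyper-G's actual algorithm: each server periodically forwards update notices to a random subset of peers, so a “document deleted/changed” event eventually reaches every linkbase without central coordination.

```python
import random

random.seed(1)  # deterministic for the example

def gossip(servers, origin, fanout=2, rounds=10):
    """Spread an update notice from `origin`; return the set of informed servers.

    Each round, every informed server forwards the notice to `fanout`
    randomly chosen peers - a simple epidemic-style propagation sketch.
    """
    informed = {origin}
    for _ in range(rounds):
        newly = set()
        for s in informed:
            for peer in random.sample(servers, k=min(fanout, len(servers))):
                newly.add(peer)
        informed |= newly
    return informed

servers = list(range(20))
informed = gossip(servers, origin=0)
print(f"{len(informed)}/{len(servers)} servers informed")
```

The probabilistic element trades a small delay for avoiding any single point of coordination, which is the property the text credits with keeping Hyper-G's links consistent.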

Like W3, Hyper-G was a client-server system with its own protocol (HG-CSP) and markup language (HTF). Hyper-G browsers integrated with Internet services W3, WAIS and Gopher, supported a range of objects (text, images, audio, video and 3D environments) and integrated authoring functionality with support for collaboration. 

Hyper-G was a highly advanced system that successfully applied key hypermedia principles to managing data on the Internet. As web usability expert Jakob Nielsen asserted, it offered “some sorely needed structure for the Wild Web” [8].

Why W3 Won

Despite acknowledged limitations, W3 retained its position as the de facto means of traversing the Internet, and continued to grow and spread its influence. The reasons for this are relatively straightforward.

W3 was free and relatively easy to use; anyone with a computer, a modem and a phone line could set up their own servers, build web sites and start publishing on the Internet without having to pay fees or enter into contractual relationships.

Although W3 was limited in terms of hypermedia capability, these shortcomings were not serious enough to prevent users from taking advantage of its data sharing and simple linking functions. Dangling links could be ignored, as search engines allowed users to find other resources, and improved browsers let users keep track of their browsing history and backtrack through visited pages.

In contrast, Microcosm and Hyper-G were developed, in their early stages at least, as local systems. This enabled them to employ superior technology to manage complex linking operations much more effectively than W3. However, this focus led to systems that were significantly more complex to manage than W3 and presented difficulties for scaling up to the wider Internet. In addition, it was not clear which parts, if any, were free to use. Both systems promoted commercial versions early in their development, which had the unintended effect of stifling adoption beyond an initial core group of users.

Future directions

W3 has developed into a sophisticated system that provides many of the open hypermedia functions it lacked in its early stages of development. Attempts to integrate hypermedia systems with W3 [3], [4], [9] and to find solutions to linking and data storage issues influenced the development of the open standard Extensible Markup Language (XML) and the XPath, XPointer and XLink syntaxes. While HTML describes documents and the links between them, XML carries descriptive data that add to or replace the content of web documents. XPath, XPointer and XLink describe addressable elements, arbitrary ranges, and connections between anchors within XML documents, respectively.
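XPath-style addressing of elements can be demonstrated with Python's standard library, whose `xml.etree.ElementTree` module supports a limited XPath subset. The document and element names here are invented for the example.

```python
import xml.etree.ElementTree as ET

# A toy XML document; element and attribute names are made up.
xml = """
<report>
  <section id="intro"><para>Opening text.</para></section>
  <section id="body"><para>First point.</para><para>Second point.</para></section>
</report>
"""

root = ET.fromstring(xml)

# XPath addresses elements structurally: by path, position and attribute
# value - rather than by an embedded, single-ended anchor.
first_para = root.find("./section[@id='body']/para")
all_paras = root.findall(".//para")

print(first_para.text)
print(len(all_paras))
```

Because the address describes a position in the document's structure rather than a hard-coded anchor, it is the kind of mechanism XPointer builds on to point into documents the author cannot edit.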

XML may be combined with the Resource Description Framework (RDF) and the Web Ontology Language (OWL) to store descriptive data that describe web content in more useful ways than simple HTML allows. These standards make web content machine-readable, enabling applications to interrogate data and automate many web activities that were previously executable only by human readers. They are seen as precursors of the ‘Semantic Web’, a development of W3 that links data points with multi-directional relationships rather than uni-directional links to documents [10].
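RDF's data model can be illustrated without any RDF tooling: statements are (subject, predicate, object) triples, and applications query the graph by pattern matching. The URIs and facts below are invented for the example; a real system would use full URIs and a triple store.

```python
# A minimal sketch of RDF's triple model in plain Python.
triples = {
    ("ex:Microcosm",   "ex:developedAt", "ex:Southampton"),
    ("ex:Hyper-G",     "ex:developedAt", "ex:Graz"),
    ("ex:Southampton", "ex:locatedIn",   "ex:UK"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return sorted((ts, tp, to) for (ts, tp, to) in triples
                  if (s is None or ts == s)
                  and (p is None or tp == p)
                  and (o is None or to == o))

# 'What was developed where?' - answerable by a machine, no human
# reading of prose required.
print(match(p="ex:developedAt"))
```

Note that the same node (`ex:Southampton`) appears as an object in one triple and a subject in another, which is exactly the multi-directional, data-point-to-data-point linking the Semantic Web aims for.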


[1] Keith Andrews, Frank Kappe, and Hermann Maurer. The Hyper-G Network Information System. Journal of Universal Computer Science, pages 206–220. Springer, 1996.
[2] Tim Berners-Lee. Information Management: A Proposal. CERN, 1989.
[3] Les A Carr, David C DeRoure, Wendy Hall, and Gary J Hill. The Distributed Link Service: A Tool for Publishers, Authors and Readers. 1995.
[4] Hugh Davis, Andy Lewis, and Antoine Rizk. Ohp: A Draft Proposal for a Standard Open Hypermedia Protocol (Levels 0 and 1: Revision 1.2-13th March. 1996). In 2nd Workshop on Open Hypermedia Systems, Washington, 1996.
[5] Andrew M Fountain, Wendy Hall, Ian Heath, and Hugh C Davis. Microcosm: An Open Model for Hypermedia with Dynamic Linking. In ECHT, pages 298–311, 1990.
[6] Frank Halasz, Mayer Schwartz, Kaj Grønbæk, and Randall H Trigg. The Dexter Hypertext Reference Model. Communications of the ACM, 37(2):30–39, 1994.
[7] Wendy Hall, Hugh Davis, and Gerard Hutchings. Rethinking Hypermedia: the Microcosm Approach, Volume 67. Kluwer Academic Publishers Dordrecht, 1996.
[8] Hermann Maurer. Hyperwave - The Next Generation Web Solution, Institute for Information Processing and Computer Supported Media, Graz University of Technology, [Online: http://www.iicm.tugraz.at/hgbook Accessed 5 December 2013].
[9] Dave E Millard, Luc Moreau, Hugh C Davis, and Siegfried Reich. Fohm: A Fundamental Open Hypertext Model for Investigating Interoperability Between Hypertext Domains. In Proceedings of the Eleventh ACM on Hypertext and Hypermedia, pages 93–102. ACM, 2000.
[10] Nigel Shadbolt, Wendy Hall, and Tim Berners-Lee. The Semantic Web Revisited. Intelligent Systems, IEEE, 21(3):96–101, 2006.

Monday, 10 February 2014

MSc Web Science - Week 19

Reading/Sam Howzit © 2012/CC BY 2.0

This week's readings:

COMP6047 - Further Web Science

COMP6048 - Interdisciplinary Thinking

  • Repko A. F. (2008) Interdisciplinary Research: Process and Theory. Sage Publications. Chapters 3 and 4.
  • Plus 6 readings for first group work.

COMP6050 - Semantic Web for Web Science

COMP6052 Social Networking Technology

RESM6003 - Qualitative Methods

Wednesday, 5 February 2014

5 interactions between the Web and Education that are changing the way we learn

Using MACs in the Computer Laboratory/University of Exeter ©2008/CC BY 2.0
The way we learn and the tools we use to extend our capacity for learning have always been closely interrelated. Over 2000 years ago wax tablets enabled learners to show their working, 500 years ago the introduction of movable type made books more accessible, 150 years ago the postal system provided the infrastructure for distance education, the introduction of radio and television services established the means for widespread educational initiatives, and personal computers and portable video making equipment were widely adopted by educators in the 1970s and 80s. Since the emergence of the Web 25 years ago, both learners and educators have exploited the potential of the underlying technologies and the services developed with them to support and change the way we think about learning in many fundamental ways.

1. Technology 

The educational value of the Internet was recognised at its inception, and computing science academics working in universities and colleges keenly adopted the technology to share data among themselves and with their students. However, as the number of resources hosted on networked computers increased, they tended to become ‘siloed’ and difficult to find. The invention of the Web fundamentally changed this environment and the way people interacted with the Internet. The underlying protocols that govern the way the Web works are based on linking electronic documents over disparate networks using web browser applications. By making the protocols open to everyone at no cost, the Web’s founders allowed people to build upon the technology; for example, one of the earliest adaptations introduced the search function, which enables users to discover resources far more easily than with earlier technologies.

In the mid-1960s Gordon Moore made an interesting observation about the processing power of computers: it appeared to double every two years. Once this filtered through to computing hardware manufacturers, and as the demand for personal computers increased, it became something of a self-fulfilling prophecy, one that has led to the development of ever more sophisticated, ever smaller, less expensive computing devices. From laptop computers to smartphones to tablets to Google Glass and Radio-Frequency Identification (RFID) devices, this phenomenon has placed powerful, mobile computing into the hands of more than 1.5 billion people worldwide, allowing learners and educators to access significantly more information than has been available to any previous generation.

2. The Evolving Web

The early Web gave learners and educators a taste of what could be achieved in this new environment. Learners could access information that had previously been ‘hidden’ in libraries and archives, and educators were able either to convert existing instructional programmes, quizzes and exams into Web-enabled resources or to develop new assets that guided learners through a set of learning objectives. But this essentially static, ‘read-only’ Web allowed little opportunity for learner interaction, collaboration and sharing, all vital components of the learning process. This began to change with the introduction of wikis in the mid-90s.

These web applications enable users to comment on or change the text of a web page written by others, and provide a platform for group collaboration and sharing. In addition to inspiring the creation of the global knowledge bank that is Wikipedia, wikis encapsulated many of the features of a ‘read, write and execute’ web, commonly referred to as Web 2.0.

The ability to readily create a presence on the Web via blogs, social networking, and video sharing sites has created a dynamic resource that continues to make radical changes to our learning and teaching experience. Web 2.0 applications have been embraced by learners and educators at all levels. YouTube and other video sharing sites provide a platform for user-generated how-to videos, software advice, and exemplars of arts and science disciplines (e.g. The LXD: TED Talk, Periodic Videos and Khan Academy) that inform and inspire millions of informal learners as well as students in formal education. The social networking site Facebook is used by teachers to facilitate collaborative group work (e.g. in Music Technology at Bridgend College), and a large number of user-generated resource sharing sites (e.g. Flickr, SlideShare and Storify) and cloud computing services (e.g. Google Drive, WeVideo, and Pixlr) enable learners and educators to extend their tools and resources beyond the traditional classroom.

3. Theory

The network of collaborative and productive spaces enabled by Web 2.0 has inspired an invigoration of constructivist educational theory and its application to a range of online learning spaces. Learners and educators are able to communicate, provide feedback and collaborate in order to co-create the learning process using a variety of free-to-access synchronous and asynchronous technologies. 

In constructivist theory, learning takes place primarily through interaction between learners, and between learners and teachers. Teachers assess the suitability of technologies in various settings and judge what are called their affordances for learning: the essential features of a technology and what its interface allows learners to do. For example, the affordances of Facebook may be the opportunities it provides to support collaboration, a shared group identity, and a shared understanding of knowledge. Once teachers are familiar with an environment, they can orchestrate learning in a manner that supports learners through the process (i.e. ‘scaffolding’).

The Web has also revived interest in ‘autonomous education’, highlighted by interest in the ‘Hole in the Wall’ experiments undertaken by Professor Sugata Mitra in the late 90s. These experiments observed children’s use of Web-connected computers placed in open spaces in rural India, and demonstrated that children were able to learn how to use the devices, find information, and teach others how to use the computers without any instruction or guidance.

While supporting opportunities for self-learning, the Web also provides a platform for delivering timely instruction and feedback that can shape learning outcomes using operant conditioning methods. This approach to teaching is based on behaviourist theory which claims that learning can be reinforced through the use of rewards and punishments. In Web-based learning environments this is normally applied through the use of ‘gamification’ techniques such as the awarding of virtual badges for achievement or through the provision of a visual indication of learner progress (e.g. a ‘progress bar’).

4. Pedagogy

New technologies inspire new approaches to teaching, and the Web has made a huge impact in this area. Formal education has adopted new approaches including the use of Virtual Learning Environments (VLEs), e-Portfolios, and Massive Open Online Courses (MOOCs), which support new blended learning methods. Course materials, formative assessments, lecture recordings (including video, audio and synchronised slides), and assignment information and submission form the backbone of the VLEs used in most educational institutions. In addition, many institutions encourage their students to develop their own e-Portfolios, self-edited collections of coursework, blog posts and other educational activity that reflect the student’s progress, experience and knowledge gained during their time at a university or college. These are often integrated with (although kept separate from) the more formal VLE and the institution’s Careers Service, and used as an addition to a student’s Higher Education Achievement Record.

VLEs are primarily used to support ‘bricks and mortar’ education; they are not viewed as a replacement for class-based learning, but are ‘blended’ with traditional methods. MOOCs, on the other hand, appear to be heralding a paradigm shift in the delivery of formal learning. This relatively new web-based form of distance learning emerged in 2008 and has its antecedents in Open Educational Resources initiatives. MOOCs typically offer an unlimited number of learners a short college- or university-level module (normally around six weeks in length), delivered using synchronous and asynchronous tutorials, web-based video, readings and quizzes. At the end of the course learners are required to produce some form of relevant feedback that demonstrates their achievement, which is then assessed by their course peers or course tutors.

5. Openness

The early decision to open Web technologies to all was inspired by research sharing practices in academia, and as the Web has developed it has been used as a platform for sharing ideas, research and teaching. Open Access to research papers, which have traditionally been published in academic journals and made available at a high premium, has the potential to transform learning and research. Making academic research available to everyone via the Web provides opportunities for wider access to learning for the poor and those living in rural areas, and improves the uptake of research outputs.

Similarly Open Educational Resource initiatives are providing opportunities for teachers to share teaching materials, allowing others to reuse and repurpose content. Issues regarding ownership of content have been overcome in many instances through the use of Creative Commons licenses – a scheme that allows content owners to clearly show how they would like others to use their material.

The increasing ubiquity of Web technologies, combined with the culture of openness promoted by the Web’s founders and the increasing availability of low-cost Web-enabled devices, is transforming opportunities for learning and teaching, and changing the way education is perceived. Despite inequality of access, the ‘digital divide’ and uneven ‘web literacies’, the opportunities for accessing education are greater today than ever before, largely due to the Web.

Encouraging a corporate open data culture: An interdisciplinary approach to assessing risk and uncertainty in the hydrocarbon exploration industry


The Royal Society’s influential paper on the use and misuse of risk analysis asserts that “[a]ny corporation, public utility or government will react to criticism of its activities by seeking…new ways to further the acceptable image of their activities” (Pearce, Russell & Griffiths, 1981). In the past decade the timely availability of relevant data has become widely acknowledged as having “a huge potential benefit” to the practice of risk assessment and management (Hughes, Murray, & Royse, 2012). Partly in response to climate change concerns the importance of access to data is acknowledged at a local, national and international level. To enable and encourage the wider use of public environmental and health related data, initiatives like the European Union’s INSPIRE Directive are establishing standardised, legally enforceable data infrastructures (European Union, 2014), and many governments have adopted ‘open data' strategies (e.g. UK Government, 2014; Google, 2011).

While the benefits of open data have been recognised and are being acted on in the public realm, most commercial organisations, despite the good intentions of some corporations (Ghafele & O’Brien, 2012; Alder, 2014), have been slow to respond. The principal barriers to data sharing in the corporate sector have been identified as concerns over intellectual property, commercial confidentiality, and ‘cultural’ issues. While not offering any actionable recommendations to tackle these issues, the UK Government’s recent ‘Foresight’ Review asserts that “a more holistic approach to risk analysis…is undoubtedly needed” (Hughes et al, 2012).

Risk analysis and the management of uncertainty demand an interdisciplinary approach (Rougier et al., 2010: 4), and the purpose of this essay is to follow this course and explore the social science disciplines of Anthropology and Economics in order to propose a combined approach that includes relevant methods from both fields. While the evolution of these disciplines has followed different trajectories, and underlying methodological differences can be identified, the increasingly blurred boundaries within science mean that the identification of discrete ontologies is problematic. The move towards transdisciplinarity, involving as it does the sharing of research tools and theoretical perspectives, and the emergence of new multidisciplinary fields (e.g. economic anthropology), provides fertile ground for developing ‘Mode 2’ research propositions (Nowotny, 2001).

Specifically, this essay explores the factors influencing data sharing in the hydrocarbon exploration industry (HEI) where potential exists for the timely publication of data gathered from monitoring hydraulic fracturing activity.


Hydraulic fracturing, more widely known as ‘fracking’, is a technique that has been used to release and collect methane gas from shale rock for more than 60 years. The fracking process employs explosive charges and specially formulated chemical fluids pumped under high pressure to help release gas for extraction. This process takes place more than 1,500m below ground level, at a significantly greater depth than typical coal mining activities (Mair et al, 2012; Wood, 2012). The British Geological Survey estimates that “resources of 1,800 to 13,000bcm [billion cubic metres]”, the equivalent of more than 23 years’ supply at current UK consumption rates, are “potentially recoverable” from sites in northern and southern England (POSTbox, 2013). However, exploration is required to discover whether this potential is realisable.

Public concerns about fracking focus on the possibility of increased seismic activity, leakage of chemical contaminants into the water table, air pollution caused by the leakage of methane, and the continuing reliance on carbon resources with potentially harmful effects on the world’s climate (Mair, et al, 2012; Kibble et al, 2013; Ricketts, 2013). These concerns have been expressed in public demonstrations against the process (The Guardian, 2013), and the introduction of moratoria on exploration in a number of countries. These public expressions of concern are viewed by the HEI as a significant additional risk to an already hazardous enterprise (Wood, 2012).

In the UK, all industrial activities are subject to health and safety audits, and some involve continuous, around-the-clock monitoring. In the HEI, for example, Cuadrilla Resources commission Ground Gas Solutions Ltd. to provide monitoring services (Cuadrilla, 2013) which aim to “…provide confidence to regulators, local communities and interested third parties that no environmental damage has occurred.” (GGS Ltd., 2013). Some of these data are made public via reports to regulatory authorities, but these reports can be subject to significant delay, are written in formal, technical language, and are not easily accessed by the general public (Boholm, 2003: 172). This essay proposes an interdisciplinary research methodology to explore the potential of allowing open access to real-time (or close to real-time) monitoring data, which could help to alleviate some public concerns.


Whether analysing large scale issues of national or global significance (macroeconomics) or focussing on the actions of individuals and local groups (microeconomics), the study of economics is defined by its evaluation of human behaviour in relation to the exploitation and control of scarce resources. In all disciplines there are varieties of opinion on the efficacies of different theories; in economics this can be illustrated by reference to the divergent theories regarding government intervention in markets advocated by Keynesian economists and those following the Chicago School. In practice economists prioritise their research by balancing the availability of data and the effectiveness of its collection against the needs of their audience (e.g. government agencies and corporations) and the strength of their beliefs in the determining factors that influence the behaviour of individuals in society (Kuznets, 1978). For example, when seeking solutions to economic depression a Keynesian may advocate increased government spending, whereas a Chicago School economist would suggest increased money supply, allowing a free market to correct itself.

Key concepts in economics include the evaluation of the costs and benefits of future economic activity and the maximisation of utility. Predicting the outcomes of activities with varying levels of uncertainty involves the collection of relevant data, risk analysis, and the evaluation of statistical probability. In high-risk investment industries the effective collection and analysis of data is vital, not least in hydrocarbon exploration, where the large rewards for discovering untapped, scarce resources are balanced by the huge investments involved in exploration. The assessment of risk plays a significant part in evaluating the potential costs and economic value of recoverable hydrocarbon resources, and multidisciplinary teams comprising geologists, statisticians, legal experts, engineers and economists are engaged within the HEI to ensure that rational choices are made, resources are used to their full potential, and risk is kept ‘as low as reasonably practicable’ (HSE, 2014). A range of complex and exhaustive appraisal models are used in evaluation, the core aims being to use data as efficiently as possible and to minimise subjectivity in order to reduce uncertainty when ascertaining the economic risks and rewards (Nederlof, 2014).

The evaluation process can be broken down into three key stages: 

  • Resource evaluation. This is normally undertaken using a "petroleum system model" and is based on the assumption of five independent geological processes that facilitate hydrocarbon accumulation: generation, migration, entrapment, retention, and recovery (Häntschel & Kauerauf, 2009). Data for each of these processes are collected using a range of tools (e.g. Geographic Information Systems software) (Hood et al, 2000).
  • Monte Carlo statistical analysis. This uses computer-based statistical analysis tools (e.g. Palisade Corporation, 2014) to process input variables many thousands of times using different random choices to create vectors of equally probable outcomes. A typical output from this process is a range of expectation curves which display the predicted outcomes in ascending order of probability (Nederlof, 2014).
  • Economic appraisal. Essentially this involves translating the predicted amount of recoverable resources into a cash value. Valuations of the resource need to take account of inflation, predicted future prices, regulation, safety, health and environmental considerations, and exploitation contracts and licences. All of these factors are subject to variation over time (e.g. the possibility of a future ‘windfall tax’), and economists typically provide a number of alternative scenarios indicating the probabilities arising from the interplay of different variables (Haldorsen, 1996).

While the statistical analysis of this detailed mesh of quantitative data is a powerful tool in helping decision makers in the HEI, economists understand that care must be taken in reaching definitive conclusions and in making predictions. A key concern is that primary data may be treated without a suitable understanding of the historical background, conventions and collection practices that influence the production of this data (Fogel, Fogel, Guglielmo & Grotte, 2013: 96). An appreciation of the contribution of anthropological research may be helpful in this area. 


Although anthropologists “cast their net far and wide” (Eriksen, 2004: 45) in order to provide context for their observations, their work is undertaken primarily through close interaction with individuals and the groups they inhabit. In-depth, structured interviews are used extensively and the key research method is ‘participant observation’ – the goal being to extensively record everyday experiences as an aid to gaining new knowledge on the existence (or otherwise) of ‘human universals’ (shared characteristics).

Developing from the study of ‘exotic’ cultures in the 19th and early 20th centuries, it was perhaps inevitable that a field as large as the scientific study of humanity at all times and in all places would branch into a heterogeneous collection of sub-disciplines - ‘urban anthropology’, ‘design anthropology’, ‘theological anthropology’, ‘digital anthropology’, and so on. Although there is probably an ‘anthropology’ for every area of human activity, each with its own unique ontology, the feature that distinguishes this social science from other, similar, disciplines (e.g. sociology) resides primarily in its approach to data collection and interpretation. Unlike researchers in most other disciplines, anthropologists immerse themselves within the social and cultural life of their subjects, living closely ‘in the field’ with the people they are studying. The purpose is to attempt to see the world from the subjects’ point of view, and to provide a rich, contextualised, ‘thick’ description and localised interpretation of this perspective (Geertz, 1994: 140). 

Data collection follows a systematic approach which typically focuses on particular fields of study, primarily: kinship, reciprocity, nature, thought and identification. For example, an anthropologist may explore how the community they are researching views reciprocity: how gifts are exchanged, how goods are paid for, and how the community views property, as well as those things that cannot be exchanged or given away (Weiner, 1992: 33). Comparisons can then be made between groups with a view to establishing and understanding similarities and differences, and ultimately identifying characteristics which are unique to specific societies and those that are universally shared (Goodenough, 1970). 

Within the terms of this essay, perhaps the most appropriate sub-discipline to explore in some detail is that in which anthropologists are commissioned by commercial organisations to describe and analyse ‘organisational culture’ – what is typically referred to as ‘organisational anthropology’. Anthropologists working in the commercial sector are usually engaged in ‘problem-oriented’ research, attempting to uncover the root of human relations issues identified by corporate leaders (Catlin, 2006). Within this environment they apply anthropological methodologies to particular fields of interest, for example: work processes, group behaviour, organisational change, consumer behaviour, product design, and the effects of globalisation and diversity (Jordan, 2010). The focus of this research is placed on talking with employees and management to reach descriptions and interpretations of the overall culture as well as any existing sub-cultures, with the aim of providing recommended courses of action that are relevant to the organisation’s strategic goals. 

In addition to work in the corporate sector, the anthropologists’ practice of long-term engagement is also useful to public policymakers, where collected data can be extremely useful in tracking changes over extended periods of time (Perry, 2013). Within the HEI, anthropologists explore the relationships between companies, state organisations and communities (Stammler & Wilson, 2006), the cultural implications of the regulation of risk (Kringen, 2008), the environmental impact on communities and their resilience to exploration (Buultjens, 2013), as well as land use and the social organisation of the workforce (Godoy, 1985).

Finally, Monte Carlo analysis is not simply the preserve of economic analysts. The method is used in other social sciences, including social anthropology (Tate, 2013), linguistics (Klein, Kuppin & Meives, 1969), education (Pudrovska & Anishkin, 2013) and public health studies (Morera & Castro, 2013), where it is applied to statistical analysis when evaluating and predicting incomplete or missing data.
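Applied to incomplete data, the same resampling idea can be sketched in a few lines. The dataset and the simple fill-in rule below are purely hypothetical illustrations (a crude form of random imputation), not the method of any study cited above:

```python
import random
import statistics

def mc_mean_with_missing(values, n_trials=5_000, seed=1):
    """Monte Carlo sketch of handling missing observations: each trial
    fills the gaps (None) with random draws from the observed values,
    then records the resulting mean. The spread of the trial means
    reflects the uncertainty the missing entries introduce."""
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    trial_means = []
    for _ in range(n_trials):
        filled = [v if v is not None else rng.choice(observed) for v in values]
        trial_means.append(statistics.mean(filled))
    return statistics.mean(trial_means), statistics.stdev(trial_means)

# Hypothetical survey responses with two missing entries
data = [4.0, 5.5, None, 6.1, 3.8, None, 5.0]
estimate, spread = mc_mean_with_missing(data)
print(f"estimated mean {estimate:.2f} +/- {spread:.2f}")
```

The attraction of the approach is that it reports not just a single imputed estimate but a distribution of plausible estimates, keeping the uncertainty visible.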

Proposal for an interdisciplinary approach

This essay has explored the relevant theories and research themes that influence those involved in economic decisions in the HEI and how anthropology approaches the study of cultures. A key element in the context of this essay is the evaluation of risk: how does the HEI balance risk and reward in the search for scarce, economically recoverable resources, and what can anthropology offer in understanding the human perception of risk? Central to the risk question, both evaluation and perception, is how data is used to aid economic decision making on the part of corporations, and to enable society to compare potential hazards and manage health, safety and environmental concerns. 

When experts analyse risk in the HEI, the terms they use to define the costs and benefits of a particular course of action are highly relevant to decision makers, but may have little meaning to “people in social settings” (Boholm, 2003: 166). While the maximisation of utility through rational choices motivates the statistical analysis of potential hydrocarbon fields, from the anthropology perspective this approach fundamentally misrepresents the essentially cultural construction of risk perception (Bourdieu, 2005: 215) and has “limited relevance for explaining how people think and act in situations where there is an element of uncertainty” (Boholm, 2003: 161). 

Although this approach is generally useful, there are two essential problems with it. Firstly, anthropologists are divided on the concept of ‘culture’. In its plural form it can be seen as divisive and not conducive to identifying human universals; definitions of ‘culture’ are often vague and do not acknowledge the permeability of boundaries in human society, or the possibilities for internal variation (Hannerz, 1992: 13). Secondly, when explaining ideas of risk and hazard, anthropology tends to favour definitions based on objective social phenomena (e.g. ‘taboo’ in traditional societies is viewed as a means of maintaining social order - Tansey and O’Riordan, 1999: 74) rather than an individual’s subjective consideration of risks based on available evidence (Slovic, 1987: 280). However, by taking care when making generalised statements regarding ‘culture’, by exploring how people “identify, understand and manage uncertainty in terms of knowledge of consequences and probabilities of events” (Boholm, 2003: 166), and by acknowledging the relevance of expert risk analysis, a consensus definition of risk can be expressed as: “a situation or event where something of human value (including humans themselves) has been put at stake and where the outcome is uncertain” (Rosa, 1998: 28).

Managing risks at both a corporate and community level entails the timely communication of relevant data in a form that can be readily understood by all parties. In the current setting economic analysis can provide some highly relevant expert insight into risk in the HEI, and anthropological research can describe and interpret the context of the perception and consideration of risk and uncertainty. 

In essence this combined approach would involve primary anthropological research methods including in-depth structured interviews, and participant observation within the HEI and affected communities. The outputs of these studies would be used to inform a more nuanced approach to uncertainty and risk in economic modelling and the use of computational methods (including Monte Carlo analysis) to predict the effects of social vulnerability and environmental protest activity on hydrocarbon exploration. By adopting this form of research methodology it is proposed that an effective approach to communicating risk can be formulated which may encourage a more transparent publication of data and help the HEI “to further the acceptable image of their activities” (Pearce, et al., 1981).


Alder S., 2014. Now here’s what you could really create with open data. Telefónica Digital Hub. [Online] Available at: http://blog.digital.telefonica.com/2013/09/15/open-data-telefonica-dynamic-insights/ [Accessed 4 January 2014].

Boholm A., 2003. The cultural nature of risk: Can there be an anthropology of uncertainty? In Ethnos: Journal of Anthropology, 68:2, 159-178. [Online] Available at: http://www.tandfonline.com/doi/abs/10.1080/0014184032000097722#.Us6-TtJdWSq [Accessed 6 January 2014].

Bourdieu, P., 2005. The Social Structures of the Economy, Cambridge: Polity. [Online] Available at: http://goo.gl/mjV5HL [Accessed 8 January 2014].

Buultjens, J., 2013. Introduction - Special Edition: The Economic and Social Policy Implications of the Coal Seam Gas (CSG) Industry in Australia. In Journal of Economic and Social Policy, 15:3. [Online] Available at: http://epubs.scu.edu.au/jesp/vol15/iss3/1 [Accessed 6 January 2014].

Catlin, L. B., 2006. What in the world is an organizational anthropologist? Claymore Associates, Inc. [Online] Available at:  http://www.lindacatlin.com/whatintheworld.pdf [Accessed on 8 January 2014].

Cuadrilla, 2013. Balcombe, West Sussex. [Online] Available at: http://www.cuadrillaresources.com/our-sites/balcombe/ [Accessed 13 October 2013].

Eriksen, T. H., 2004. What is Anthropology? London: Pluto Press.

European Union, 2014. INSPIRE: Infrastructure for Spatial Information in the European Community. [Online] Available at: http://inspire.jrc.ec.europa.eu/ [Accessed 3 January 2014].

Fogel, R. W., Fogel, E. M., Guglielmo, M., & Grotte, N., 2013. Political Arithmetic: Simon Kuznets and the Empirical Tradition in Economics. Chicago and London: University of Chicago Press: [Online] Available at: http://lib.myilibrary.com/Open.aspx?id=484552 [Accessed 2 January 2014].

Geertz, C., 1994. Thick description: Toward an interpretive theory of culture. In Readings in the philosophy of social science, 213-231. [Online] Available at: http://civirtual.comunicamos.org/wp-content/uploads/group-documents/4/1363016634-denzin_lincoln_turning_points_qual_research_2003.pdf#page=152 [Accessed 5 January 2014].

Ghafele, R. D. & O’Brien, R., 2012. Open innovation for sustainability: Lessons from the GreenXchange experience. In: International Centre on Trade and Sustainable Development Policy Brief No. 13, 1-10. [Online] Available at: http://mpra.ub.uni-muenchen.de/40440/ [Accessed 30 December 2013].

Godoy, R., 1985. Mining: anthropological perspectives. In Annual Review of Anthropology, 14, 199-217. [Online] Available at: http://www.jstor.org/stable/2155595 [Accessed 7 January 2014].

Goodenough, W. H., 1970. Description and comparison in cultural anthropology. New York: Press Syndicate of the Cambridge University. [Online] Available at: http://goo.gl/ze8ADQ [Accessed 6 January 2014].

Google, 2011. World Map of Government Open Data Initiatives. [Online] Available at maps.google.com/maps/ms?ie=UTF8&oe=UTF8&msa=0&msid=105833408128032902805.00048bfbba4ecb314e822 [Accessed 3 January 2014].

GGS Ltd., 2014. Ground Gas Solutions Limited home page [Online] Available at: http://ground-gassolutions.co.uk/ [Accessed 13 October 2013].

The Guardian, 2013. Fracking protesters gather for six-day camp as Balcombe drilling suspended, 16 August 2013. [Online] Available at: http://www.theguardian.com/environment/2013/aug/16/fracking-protesters-camp-balcombe-drilling [Accessed 10 October 2013].

Haldorsen, H. H., 1996. Choosing between rocks, hard places and a lot more: the economic interface. In Norwegian Petroleum Society Special Publications, 6, 291-312. [Online] Available at: http://www.sciencedirect.com/science/article/pii/S0928893707800257 [Accessed 30 December 2013].

Hannerz, U., 1992. Cultural Complexity. New York: Columbia University Press. [Online] Available at: http://goo.gl/1jV7c7 [Accessed 15 October 2013].

Häntschel, T., and Kauerauf, A. I., 2009. Fundamentals of Basin and Petroleum Systems Modeling. Springer: Berlin Heidelberg. [Online] Available at: http://link.springer.com/book/10.1007%2F978-3-540-72318-9 [Accessed 29 December 2013].

Health and Safety Executive (HSE), 2014. ALARP – As low as reasonably practicable. [Online] Available at: http://www.hse.gov.uk/comah/alarp.htm [Accessed 10 January 2014].

Hood, K. C., South, B. C., Walton, F. D., Baldwin, O. D., & Burroughs, W. A., 2000. Use of geographic information systems in hydrocarbon resource assessment and opportunity analysis. In Geographic information systems in petroleum exploration and development: AAPG, Computer Applications in Geology, 4, 173-185. [Online] Available at: http://archives.datapages.com/data/specpubs/ca04/chap12/ca04ch12.htm [Accessed 29 December 2013].

Hughes, R., Murray, V., & Royse, K., 2012. Data sharing: Commissioned Review, Foresight, Government Office for Science: London UK. [Online] Available at:  http://nora.nerc.ac.uk/20726/ [Accessed 29 December 2013].

Jordan, A. T., 2010. The importance of business anthropology: its unique contributions. In International Journal of Business Anthropology, 1:1, 15-25. [Online] Available at: http://www.na-businesspress.com/IJBA/JordanWeb.pdf [Accessed 7 January 2014].

Kibble, A., Cabianca, T., Daraktchieva, Z., Gooding, T., Smithard, J., Kowalczyk, G., McColl, N. P., Singh, M., Vardoulakis, S. and Kamanyire, R, 2013. Review of the potential public health impacts of exposures to chemical and radioactive pollutants as a result of the shale gas extraction: Draft for Comment. Centre for Radiation, Chemical and Environmental Hazards, Public Health England. [Online] Available at: http://www.hpa.org.uk/Publications/Environment/PHECRCEReportSeries/1310Reviewofthepotentialhealthimpactsshalegas/ [Accessed 5 January 2014].

Klein, S., Kuppin, M. A., & Meives, K. A., 1969. Monte Carlo simulation of language change in Tikopia & Maori. In Proceedings of the 1969 conference on Computational linguistics 1-27. Association for Computational Linguistics. [Online] Available at: http://ftp.cs.wisc.edu/pub/techreports/1969/TR62.pdf [Accessed 8 January 2014].

Kringen, J., 2008. Culture and control: Regulation of risk in the Norwegian Petroleum Industry. PhD. University of Oslo. Available at: https://www.duo.uio.no/handle/10852/17876 [Accessed 4 January 2014].

Kuznets, S., 1978. Problems of Quantitative Research in Economics. In Philippine Review of Economics, 15:1. [Online] Available at: http://pre.econ.upd.edu.ph/index.php/pre/article/view/416 [Accessed 29 December 2013].

Mair, R., Bickle, M., Goodman, D., Koppelman, B., Roberts, J., Selley, R. & Younger, P., 2012. Shale gas extraction in the UK: a review of hydraulic fracturing. The Royal Society and The Royal Academy of Engineering. [Online] Available at: raeng.org.uk/shale [Accessed 29 December 2013].

Morera, O. F., & Castro, F. G., 2013. Important Considerations in Conducting Statistical Mediation Analyses. American journal of public health, 103:3, 394-396. [Online] Available at: http://ajph.aphapublications.org/doi/abs/10.2105/AJPH.2012.301047?journalCode=ajph [Accessed 8 January 2014].

Nederlof, M. H., 2014. Geology and Energy Analysis. [Online] Available at: http://www.mhnederlof.nl/ [Accessed 28 December 2013].

Nowotny, H., 2001. The transformation society, from Nowotny, H. Re-thinking science : knowledge and the public in an age of uncertainty. Cambridge : Polity Press, 1-20. [Online] Available at: https://www.dd.library.soton.ac.uk/HL/EDUC6236/00508795.pdf [Accessed 5 January 2014].

Palisade Corporation, 2014. @RISK Helps Newly-deregulated Eastern European Power Market Meet EU Standards. [Online] Available at: https://www.palisade.com/cases/Transelectrica.asp [Accessed 30 December 2013].

Pearce, D. W., Russell, S., & Griffiths, R. F., 1981. Risk assessment: Use and misuse [and discussion]. In Proceedings of the Royal Society of London. A. Mathematical and Physical Sciences, 376:1764, 181-192. [Online] Available at: http://rspa.royalsocietypublishing.org/content/376/1764/181.full.pdf [Accessed 30 December 2013].

Perry, S. L., 2013. Using Ethnography to Monitor the Community Health Implications of Onshore Unconventional Oil and Gas Developments: Examples from Pennsylvania's Marcellus Shale. In NEW SOLUTIONS: A Journal of Environmental and Occupational Health Policy, 23:1, 33-53. [Online] Available at: http://www.ncbi.nlm.nih.gov/pubmed/23552647 [Accessed 5 January 2014].

POSTbox (Parliamentary Office of Science and Technology), 2013. UK Shale Gas Potential: Shale Gas Resource and Reserve Estimates. [Online] Available at: http://www.parliament.uk/documents/post/ShaleGas_POSTbox.pdf [Accessed 5 January 2014].

Pudrovska, T., & Anishkin, A., 2013. Clarifying the Positive Association Between Education and Prostate Cancer A Monte Carlo Simulation Approach. In Journal of Applied Gerontology. [Online] Available at: http://jag.sagepub.com/content/early/2013/01/30/0733464812473798.abstract [Accessed 8 January 2014].

Ricketts, A., 2013. Investment Risk: An Amplification Tool for Social Movement Campaigns Globally and Locally. In Journal of Economic and Social Policy: 15:3. [Online] Available at: http://epubs.scu.edu.au/jesp/vol15/iss3/4 [Accessed 4 January 2014].

Rosa, E. A., 1998. Metatheoretical Foundations for Post-Normal Risk. In Journal of Risk Research, 1:1, 15–44. [Online] Available at: http://www.tandfonline.com/doi/pdf/10.1080/136698798377303 [Accessed 8 January 2014].

Rougier, J., Sparks, S., Aspinall, W., Cornell, S., Crosweller, S., Edwards, T., Freer, J., Hill, L., & Hincks, T., 2010. SAPPUR: NERC Scoping Study on Uncertainty and Risk in Natural Hazards. Summary and recommendations. Bristol Environmental Risk Research Centre (BRISK), University of Bristol, UK. [Online] Available at http://www.nerc.ac.uk/research/programmes/pure/documents/sappur-summary-report.pdf [Accessed 3 January 2014].

Slovic, P., 1987. Perception of Risk. In Science, New Series, 236:4799, 280-285 [Online] Available at:  http://www.uns.ethz.ch/edu/teach/0.pdf [Accessed 9 January 2014].

Stammler, F., & Wilson, E., 2006. Dialogue for Development: An Exploration of Relations between Oil and Gas Companies, Communities, and the State. In Sibirica, 5:2, 1-42. [Online] Available at: http://dx.doi.org/10.3167/136173606780490739 [Accessed 5 January 2014].

Tansey, J. & O’Riordan, T., 1999. Cultural Theory and Risk: A Review. In Health, Risk & Society, 1:1, 71–90. [Online] Available at: http://paul-hadrien.info/backup/LSE/IS%20490/utile/cultural%20theory%20and%20risk%20review.pdf [Accessed 8 January 2014].

Tate, E., 2013. Uncertainty analysis for a social vulnerability index. In Annals of the association of American geographers, 103:3, 526-543. [Online] Available at: http://www.tandfonline.com/doi/abs/10.1080/00045608.2012.700616#.Us2S3_RdV8H [Accessed 8 January 2014].

UK Government, 2014. DATA.GOV.UK: Opening up Government. [Online] Available at: http://data.gov.uk/ [Accessed 3 January 2014].

Weiner, A. B., 1992. Inalienable possessions: The paradox of keeping-while-giving. University of California Press. [Online] Available at: http://goo.gl/kpbMJd [Accessed 7 January 2014].

Wood, J., 2012. The global anti-fracking movement: what it wants; how it operates and what’s next. Control Risks. [Online] Available at: http://www.controlrisks.com/Oversized%20assets/shale_gas_whitepaper.pdf [Accessed 5 January 2014].

How does a ‘social science’ or ‘philosophy of science’ perspective on science and technology inform Web Science?

‘A Manifesto for Web Science’ (Halford, Pope and Carr, 2010) defines the essential characteristics of this relatively new area of study: Web Science “must be a critical discipline” that “looks both ways to see how the web is made by humans and how humans are made by the web”. This broadly socio-technical approach is derived from studies that critically respond to widely-held ‘positivist’ accounts of ‘Normal Science’. These accounts depict the practice of science as a systematic means of discovering facts about the natural world that inevitably progresses toward improved understandings, and technology as a reasonably uncomplicated application of these discoveries. This classical empiricist argument presents a straightforward worldview within which disinterested scientists seek to objectively develop a body of proven knowledge; they make observations, establish hypotheses, collect data and use these to establish new theories. Technologists play a secondary, essentially pragmatic role and “identify needs, problems, or opportunities, and creatively combine pieces of knowledge to address them” (Sismondo, 2010: 8). 

Criticism of this view emerged in the mid-twentieth century, developing Hume’s critique of inductive reasoning (Hume, 1748 [2007]: 26) to establish new ways of thinking about the practice of science and the relationship between science, technology and society.  Originating from theories of falsification (Popper, 1959), ‘epistemological anarchism’ (Feyerabend, 1975) and theories expounding ‘paradigms’ and ‘communities’ as an explanation for scientific and technological progress (Kuhn, 1962), over the past half century this body of thought has developed under the general heading of Science and Technology Studies (STS).

Perceived by some as a threat to the authority and survival of ‘Normal Science’ (Stove, 1982; Theocharis and Psimopoulos, 1987), STS, with its inclusion of non-human as well as human ‘actors’ within its field of view, arguably draws a more nuanced picture of the social construction of science and technology than previous models, and is potentially more suited to the study of the Web. As a system made up of “decentralised information structures … [and] informal and unplanned informational links between people, agents, databases, organisations and other actors and resources” (Berners-Lee et al., 2006), analysis of the Web needs to reflect this structure, and to employ and develop relevant research methodologies. 

The question then arises that if ‘Normal Science’ explanations are not fit for purpose, what should be the proper approach to doing Web Science?

The processes involved in undertaking scientific enquiry are based on methodologies that set parameters for measurement, analysis, evaluation, and iteration. An essential outcome of this activity is the communication of results. Because of the nature of funding, most scientists are ultimately called to account on their ability to provide effective evidence to substantiate their claims (Theocharis & Psimopoulos, 1987: 598). Web scientists must not only convince their peers in their own and other disciplines of their competence, the efficacy of their methods, and of the explanatory or predictive power of their conclusions (Prelli, 1990: 89-90), but also non-experts, whether they are in communities, governments or corporations. Essentially, in a world where positivist thought continues to maintain dominance, web scientists need to adopt methodologies that are accepted as effective, and modes of communication that are persuasive. In this context STS, the study of “… the myriad, daily negotiations among human and non-humans that make up the consensus called technology” (Haraway, 1997) has much to offer practitioners of Web Science. 

Kuhn’s recognition that science and technology are essentially socially constructed activities, no different from any other work, had a huge impact on the study of science, technology and society that followed in its wake. The assertion that groups of scientists and by extension, technologists, share common methodologies, modes of communication and interpretations of their work which are constructed in social settings in ways that do not accord with straightforward positivist interpretations, continues to be explored by social scientists and anthropologists. 

The common thread of STS is that “technologies … gain sense and significance within everyday activities and ordinary experience” (Heath, Luff and Svensson, 2003: 77), and various methodologies have been and are being developed to explain and deepen our understanding of how science and technology are socially constructed. A significant contribution to these evolving methodologies is the ‘Strong Programme’ of the Sociology of Scientific Knowledge (SSK). Bloor’s ‘four tenets’ establish a clear research framework. Methodologies should be:

  • Causal - concerned with conditions that bring about beliefs or states of knowledge
  • Impartial - with respect to truth and falsity, rationality or irrationality, success or failure. All sides require explanation.
  • Symmetrical in the style of explanation.
  • Reflexive - applicable to sociology itself. (Bloor, 1991 [1995]: 5).

With its agnosticism toward scientific truths and methodological symmetry, this approach focuses on the work as it is performed, and has an open, naturalistic attitude to science and technological knowledge. It applies the concept of ‘finitism’, in that social forces affect interpretations and rules are adapted when applied to new cases.  

Criticisms of the Strong Programme have centred on the tendency of practitioners to overlook the changeable nature of society and to simplify the interests of participants as well as areas of conflict. This leads to problems in demonstrating causal links between beliefs and membership of social groups.

The link between technology, beliefs and membership of social groups is highlighted by explanations of how powerful interests may benefit in the social construction of technology. Langdon Winner asserts that by the adoption of some technologies people are unconsciously coerced into actions that may be against their interests. However, while power can be exercised in the “design and arrangement of a device or system” to the potential benefit of individuals who align themselves with powerful institutions, it is by no means given that these technologies have “intractable properties” (Winner, 1986). As Pinch and Bijker convincingly argue in their description of the development of the bicycle, as technologies are brought into the field of practice, users exercise ‘interpretive flexibility’ (Pinch and Bijker, 1989). That is, science and technology are essentially rhetorical operations: inventors design artefacts to solve particular problems with specific uses in mind, but users adapt and modify them to fit many and various unforeseen circumstances. 

In terms of web-based technologies for learning, ideas generated within the Social Construction of Technology (SCOT) closely align with the concept of affordances. This concept has been adopted by educational studies from Gestalt psychology to describe characteristics of the learning process (Laurillard et al, 2000) often attributed to learning technologies. The basic affordances of an artefact are “fundamental properties that determine just how the thing could possibly be used” (Norman, 1985) which are “usually perceivable directly, without an excessive amount of learning” (Gibson, 1979). For example, research into students’ use of lecture video recordings has shown that they do not watch entire lectures (as might have been predicted by developers), but fast-forward through the material to find content of particular interest to them (Gorissen, Van Bruggen and Jochems, 2011).

The adaptation of technologies to meet various specific needs also opens to question the position that scientists and technologists can be studied as discrete and identifiable communities with shared methodologies and ontologies. Haraway’s exploration of disputes within feminism in the eighties (Haraway, 1991) indicates that categorisation of communities is problematic.

In addition, Haraway’s re-evaluation of human-machine symbiosis personified by the ‘cyborg’, a creature often depicted as threatening in science fiction, represented the construct as an empowering figure. The cyborg concept is revisited twenty years later in the context of the Web as an expression of ‘post-human’ entities brought about by the wide adoption of web technology (Hayles, 2006). Hayles re-imagines the cyborg web as the ‘cognisphere’ - a non-human, or disembodied, network that does not replace the human body but extends it through incorporation into human life practices. Non-human networked and programmable media become ‘cultural cognitions’ as they impact on human sensory-motor functions, cognitive processing, and wider political and economic activity. The individual is no longer an appropriate unit of analysis as these ‘cultural cognitions’ are embodied both in people and their technologies.

In direct response to concerns about the prevalence of social determinism in STS, Actor Network Theory (ANT) replaces artefacts and social relations with “chains which are associations of humans … and non-humans” (Latour, 1991: 110). Latour acknowledges that the processes of science and technology are alike and coins the term ‘technoscience’ to encapsulate both (Latour, 1987: 19) and in a similar manner to the Strong Programme, ANT employs methodological symmetry and makes no hierarchical distinction between the human and non-human. 

Actor Network Theory employs three key concepts to describe and analyse the 'co-evolution' of technoscience and society: 

  • Actor worlds: defines the identities, histories, sizes, theories, and roles that unite the diverse entities (human and non-human) involved in specific areas of study.
  • Translation: “To translate is to speak for, to be indispensable, and to displace.” The researcher ‘delineates the scenario’, ‘problematises’ actions and sets out the terrain that the ‘actor-world’ inhabits. To reach a ‘stable construction’ a process of displacement takes place when the entities under analysis are written up within the context of their physical and social environment.
  • Actor networks: The description of the dynamics and internal structure of actor-worlds which emphasises that the structure is “susceptible to change” (Callon, 1986).

In his analysis of the development of an electric car in France in the 1970s, Callon describes the role of electronic fuel cells within the actor network as a “black box whose operation has been reduced to a few well-defined parameters, gives way to a swarm of new actors: scientists and engineers who claim to hold the key to its functioning”. In this analytical process controversies are divided into a series of other elements as a watch is "dismantled by a jeweller to find out what is wrong" (Callon, 1986: 30).

In the study of the Web, ANT methodology can be used to unpick and analyse the actor world within which heterogeneous entities (individual users, researchers, web protocols, network infrastructure, the National Grid, news media, Mark Zuckerberg etc.) interact on an equal footing with (for example) a social network platform. The object of study is not the actors themselves, but the phenomena expressed through the interplay of these components (Barad, 2003).

Latour asserts that with the increasing availability of digital techniques and tools which allow “the tracing and visualization of … social phenomenon”, and as digital profiles are changing the definition of what it means to be an individual, it may be more productive to focus on this ‘one-level standpoint’, in contrast to exploring how individual decisions impact on social constructions (the ‘two-level standpoint’). Latour’s hypothesis is that significant, deeply entrenched social phenomena may be fruitfully investigated, analysed and evaluated through studying the ‘performance’ of new data mining and visualisation techniques. “Web 2.0...has turned [one-level standpoint] navigation into a mainstream experience which might be captured in a sentence: the more you wish to pinpoint an actor, the more you have to deploy its actor-network.” (Latour et al., 2012: 591).

Criticism of theories that foreground the social construction of technology focuses on the lack of practical utility this understanding brings to those actively involved in the experience (Hacking, 1999: 2). However, STS provides a diverse, eclectic and in-depth range of approaches to the study of the co-evolution of science, technology and society (e.g. the heterogeneity of actors under scrutiny, the equal treatment of human and non-human entities, and methodological agnosticism), and indicates the efficacy of rejecting a ‘fixed theory of rationality’ and applying an ‘anything goes’, mixed approach when developing methodologies for Web Science (Feyerabend, 1975: 28). As Haraway asserts: 
‘The point is to make a difference in the world, to cast our lot for some ways of life and not others. To do that one must be in the action, be finite and dirty, not transcendent and clean.’ (Haraway, 1997: 36).

In education, as in other fields, it is acknowledged that technology provides "the means through which individuals engage and manipulate both resources and their own ideas" (Hannafin, Land, and Oliver, 1999: 128), but also that “...new technology easily supports a fragmented, informational view of knowledge…and is in danger of promulgating only that.” (Laurillard, 2002: 227). In this environment policy makers require guidance on the potential impact of web technologies (e.g. Jisc, 2013 and New Media Consortium, 2013) that enables educators to develop interventions which orchestrate and scaffold learning. This strongly indicates the need for research and evaluation methods that explore the full range of potential affordances and constraints of web technologies, and for predictive tools that provide reliable indicators of future challenges and opportunities.

Therefore a mixed approach is required, one that enables researchers to choose appropriate methods, whether the randomised trials of the positivist tradition, other quantitative and qualitative methods, or pragmatic evaluation processes. Since the second half of the last century there have been calls to democratise science and technology (Feyerabend, 1999: 224), and the inclusion of public participation “increase[s] the quality and relevance of the research” (Staley et al., 2012). In recent years the employment of “multiple evaluators” in heuristic evaluation (Nielsen, 2009), user involvement in systematic literature reviews (EPPI Centre, 2013), and public involvement in the research and evaluation of health technologies (NIHR, 2013) indicates an increased awareness of the usefulness of non-expert involvement in research practices. This may take the form of ‘crowdsourcing’, for example data collection using Mechanical Turk initiatives (e.g. Saunders, Bex and Woods, 2013), ‘citizen science’ interventions (e.g. Crowston and Prestopnik, 2013), online surveys (e.g. De Vera et al., 2010), or the formal engagement of lay panels in the assessment of research options (e.g. Boote et al., 2012).

In 2007 the environmental scientist Mike Hulme contributed an article to The Guardian about the role of STS in the study of the changing global climate: 
All of us alive today have a stake in the future, and so we should all play a role in generating sufficient, inclusive and imposing knowledge about the future. Climate change is too important to be left to scientists - least of all the normal ones. (Hulme, 2007).
This sentiment applies equally to the study of the Web. In order to develop effective research programmes that disentangle the complex relationships between people and technology, that facilitate better understandings of the impact of changes brought about by our interaction with the Web, and improve the ability to predict the effect of Web-based activity, scientists in the field, informed by STS theories, need to employ wide-ranging, diverse and relevant methodologies. 


Barad, K., 2003. Posthumanist performativity: towards an understanding of how matter comes to matter. In Signs, 28(3), pp. 801-831.

Berners-Lee, T., Hall, W., Hendler, J.A., O’Hara, K., Shadbolt, N. and Weitzner, D.J., 2006. A Framework for Web Science. In Foundations and Trends in Web Science, 1(1), 1-130. [Online] Available at http://eprints.soton.ac.uk/263347/1/1800000001%5B1%5D.pdf (Accessed 1 December 2013).

Bloor D., 1991. The strong programme. In Knowledge and Social Imagery. Ch 1 [Online] Available at http://faculty.washington.edu/lynnhank/Bloor.pdf (Accessed 1 December 2013).

Boote, J. D., Dalgleish, M., Freeman, J., Jones, Z., Miles, M., and Rodgers, H., 2012. ‘But is it a question worth asking?’ A reflective case study describing how public involvement can lead to researchers’ ideas being abandoned. In Health Expectations. [Online] Available at http://onlinelibrary.wiley.com/doi/10.1111/j.1369-7625.2012.00771.x/full (Accessed 9 December 2012).

Callon, M., 1986. The Sociology of an Actor-Network: the Case of the Electric Vehicle. In M. Callon, J. Law and A. Rip (Eds.) Mapping the Dynamics of Science and Technology: Sociology of Science in the Real World. London, Macmillan: 19-34. [Online] Available at http://www.homepages.ucl.ac.uk/~ucessjb/S3%20Reading/callon%201986.pdf (Accessed 29 November 2013).

Crowston, K. and Prestopnik, N.R., 2013. Motivation and Data Quality in a Citizen Science Game: A Design Science Evaluation. In 2013 46th Hawaii International Conference on System Sciences (HICSS), pp. 450-459. [Online] Available at http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6479888&isnumber=6479821 (Accessed 11 December 2013).

De Vera, M. A., Ratzlaff, C., Doerfling, P. and Kopec, J., 2010. Reliability and validity of an internet-based questionnaire measuring lifetime physical activity. In American Journal of Epidemiology, 172(10), 1190-1198.

EPPI Centre, 2013. Welcome to the EPPI Centre. [Online] Available at  http://eppi.ioe.ac.uk/cms/ (Accessed 9 December 2013).

Feyerabend, P., 1975. Against Method: Outline of an Anarchistic Theory of Knowledge. London: Verso.

Gibson, J. J., 1979. The Ecological Approach to Visual Perception, Boston: Houghton Mifflin Company.

Gorissen, P., Van Bruggen, J.M. and Jochems W., 2011. Analysing the use of recorded lectures by students. Short paper presented during the ALT-C conference in Leeds, UK [Online] Available at http://www.slideshare.net/PiAir/altc-2011-presentation (Accessed 25 November 2013).

Hacking, I., 1999. The social construction of what? Harvard University Press.

Halford, S., Pope, C. and Carr, L., 2010. A Manifesto for Web Science. In Proceedings of the WebSci10: Extending the Frontiers of Society On-Line, Raleigh, US, 26-27 Apr 2010, 1-6. [Online] Available at http://eprints.soton.ac.uk/271033/ (Accessed 5 November 2013).

Hannafin, M., Land, S., and Oliver, K., 1999. Open learning environments: Foundations, methods, and models. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory (Vol. II, pp. 115-140). Mahwah, NJ: Lawrence Erlbaum Associates.

Haraway, D., 1991. A Cyborg Manifesto: Science, Technology and Socialist Feminism in the Late Twentieth Century. In D. Haraway (Ed.) Simians, Cyborgs and Women: the Reinvention of Nature. London, Free Association Books: 149-181.

Haraway, D. J., 1997. Modest-witness@ second-millennium. Femaleman [Copyright]-meets-oncomouse [Trademark]: Feminism and Technoscience. Psychology Press.

Hayles, N. K., 2006. Unfinished Work: From Cyborg to Cognisphere. In Theory, Culture & Society, December 2006 23: 159-166, [Online] Available at http://tcs.sagepub.com/content/23/7-8/159 (Accessed 5 December 2013).

Heath, C., Luff, P., and Svensson, M. S., 2003. Technology and medical practice. In Sociology of Health & illness, 25(3), 75-96. [Online] Available at http://onlinelibrary.wiley.com/doi/10.1111/1467-9566.00341/full (Accessed 7 November 2013).

Hulme, M., 2007. The appliance of science. In The Guardian. [Online] 14 March. Available at http://www.theguardian.com/society/2007/mar/14/scienceofclimatechange.climatechange (Accessed 25 November 2013).

Hume, D., 2007. An enquiry concerning human understanding. Oxford University Press. Originally published in 1748. [Online] Available at http://www.gutenberg.org/ebooks/9662 (Accessed 6 November 2013).

Jisc, 2013. Jisc Inform: Spotting emerging technologies. Issue 36. [Online] Available at http://www.jisc.ac.uk/inform/inform36/SpottingEmergingTechnologies.html#.UpHf5dK-2So (Accessed 20 November 2013).

Kuhn, T. S., 1962. Historical Structure of Scientific Discovery. In Science, 136 (3518), June 1, 1962, 760-764. [Online] Available at http://www.compilerpress.ca/Competitiveness/Anno/Anno%20Kuhn%20History%20of%20Discovery.htm (Accessed 7 November 2013).

Latour, B., 1987. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge MA: Harvard University Press.

Latour, B., Jensen, P., Venturini, T., Grauwin, S. and Boullier, D., 2012. ‘The whole is always smaller than its parts’ – a digital test of Gabriel Tarde's monads. In The British Journal of Sociology, 63: 590–615. [Online] Available at http://onlinelibrary.wiley.com/doi/10.1111/j.1468-4446.2012.01428.x/full (Accessed 26 November 2013).

Latour, B., 1991. Technology is Society Made Durable. In Law J (Ed.) A Sociology of Monsters? Essays on Power, Technology and Domination, Sociological Review Monograph. London, Routledge: 103-131.

Laurillard, D., Stratford, M., Luckin, R., Plowman, L., and Taylor, J., 2000. Affordances for Learning in a Non-Linear Narrative Medium. In Journal of Interactive Media in Education, 2000(2). [Online] Available at www-jime.open.ac.uk/00/2 (Accessed 28 November 2013).

Laurillard, D., 2002. Rethinking university teaching: A conversational framework for the effective use of learning technologies. Psychology Press.

National Institute of Health Research, 2013. Evaluation, trials, studies. [Online] Available at http://www.nets.nihr.ac.uk/ (Accessed 10 December 2013).

New Media Consortium, 2013. Sparking innovation, learning and creativity. [Online] Available at http://www.nmc.org/ (Accessed 10 December 2013).

Nielsen, J., 2009. How to Conduct a Heuristic Evaluation. [Online] Available at http://www.useit.com/papers/heuristic/heuristic_evaluation.html (Accessed 10 December 2013).

Norman, D. A., 1988. The Design of Everyday Things. New York: Basic Books.

Pinch, T. J. and Bijker, W.E., 1989. The Social Construction of Facts and Artifacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other. In Bijker WE, Hughes TP and Pinch TJ (eds.) The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. Cambridge, MA, MIT Press.

Popper, K., 1959. The logic of scientific discovery. London & New York: Routledge Classics. First published 1959 by Hutchinson & Co. [Online] Available at http://goo.gl/7MW7PE (Accessed 10 November 2013). 

Prelli, L. J., 1990. Rhetorical logic and the integration of rhetoric and science. In Communication Monographs. Volume 57, Issue 4, 1990 pages 315-322 [Online] Available at http://dx.doi.org/10.1080/03637759009376206 (Accessed 22 November 2013).

Saunders, D. R., Bex, P. J., and Woods, R. L., 2013. Crowdsourcing a Normative Natural Language Dataset: A Comparison of Amazon Mechanical Turk and In-Lab Data Collection. In Journal of Medical Internet research, 15(5). [Online] Available at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3668615/ (Accessed 10 December 2013).

Sismondo, S., 2010. An Introduction to Science and Technology Studies Oxford, Wiley-Blackwell.

Staley, K., Buckland, S. A., Hayes, H. and Tarpey, M., 2012. ‘The missing links’: understanding how context and mechanism influence the impact of public involvement in research. In Health Expectations. [Online] Available at onlinelibrary.wiley.com/doi/10.1111/hex.12017/abstract (Accessed 10 December 2013).

Stove, D. C., 1982. Popper and After: Four Modern Irrationalists. Oxford: Pergamon Press.

Theocharis, T., and Psimopoulos, M., 1987. Where Science has Gone Wrong. In Nature, Vol 329, pp 595-598. [Online] Available at http://www.nature.com/nature/journal/v329/n6140/pdf/329595a0.pdf (Accessed 5 November 2013).