
4. Data colonialism, surveillance capitalism and drones

Faine Greenwood

The world is undergoing a process of ‘datafication’ (Mayer-Schönberger and Cukier, 2013) and the aid sector is no exception. Humanitarian and development workers are using newly available big data products to make operational decisions, learn more about crises and monitor the impact of ongoing response. This ‘humanitarian data ecosystem’ (Raymond and Al Achkar, 2016) encompasses many information sources, including social media postings, open-source maps, detailed records of mobile phone calls, as well as imagery taken from satellites and drones. Much of this information now comes from platforms or data sets controlled by private corporations, such as Facebook, Google and Twitter (Burns, 2018; Taylor and Broeders, 2015), and humanitarians increasingly partner with private corporations that have the data collection and analysis capacities they do not possess themselves (Fontainha et al., 2016).

Taken together, these developments can be described as a ‘technocratic turn’ in the humanitarian aid and development sector, as technical expertise grows in importance and impact (Read et al., 2016). At the same time, international debate has flared over the role that technology companies play in government and in our personal lives. Critics are developing new descriptions of the ‘data extraction’ (Zuboff, 2015, 2019) processes these companies rely upon to make profits. Couldry and Mejias (2018) and Thatcher, O’Sullivan et al. (2016) refer to ‘data colonialism’, a term that describes a means of ‘capitalist accumulation by dispossession that colonizes and commodifies everyday life in ways previously impossible’ (Couldry and Mejias, 2018, p. 1). Zuboff (2019) describes the system of ‘surveillance capitalism’, in which human data by-products are converted into wealth, often at the expense of the producers of this raw material.

Small civilian drones are particularly representative objects of this technocratic turn in humanitarian aid, due both to their perceived novelty and to the ‘disease’ of surveillance capitalism with which they are often associated. The drone ‘allows us to project our intelligence into the air and to exert our influence over vast expanses of space’ (Wallace-Wells, 2014). Drones are sociotechnical ‘assemblages of the vertical’ (Crampton, 2016, p. 1) that permit us to view the world from a novel perspective: below the high-altitude flight path of manned aircraft and satellites, but above the viewpoint of an earthbound human being (Garrett and Anderson, 2018). In the context of humanitarian aid and development, this new perspective is often hailed as one that permits humanitarians to do their jobs more effectively and to quite literally see need more clearly: it allows the circumstances they operate in to become more ‘legible’ to them (Scott, 1998). However, this new-found aerial perspective also brings with it novel risks to the people who are being surveilled, often without their explicit consent or knowledge. Importantly, these risks are by no means restricted to drones, which are simply one of many new data collection technologies used by humanitarian aid workers today. The ethical quandaries that they present are closely linked to those presented by the use of, for example, geolocated mobile phone data to track the spread of disease or the use of facial recognition technology to control refugee access to financial aid. This chapter will use civilian drone technology as an entry point to a broader consideration of data collection practices in the aid and development sector today, situating the technology within the dual frameworks of data colonialism and surveillance capitalism.

Big data, innovation and humanitarian aid

Humanitarian data collection has been present since very early in the history of the humanitarian movement. Read et al. (2016, p. 5) trace the ‘continuing fascination with speed and accuracy in information management, as well as fundamental ethical issues arising from data mining’ to the 1870s and J.C. Chenu’s and Florence Nightingale’s statistical analysis of casualties in the Crimean and Franco-Prussian Wars (L. McDonald, 2013). The so-called ‘humanitarian data revolution’ (Dickinson, 2016), however, is a more recent development, aligned with other data-driven technological advances outside the humanitarian sector. Many commentators (Meier, 2015; Read et al., 2016) trace the origin point or foundational moment of this new epoch of humanitarian data to 2010 and the response to the devastating Haiti earthquake. A new cadre of remotely distributed, spontaneous digital volunteers used map-making software, social media postings and other digital data sources to improve humanitarian situational awareness. These new networks of volunteers are often referred to as ‘digital humanitarians’ (Hunt and Specht, 2019; Meier, 2015). Social media postings (especially geotagged postings) ostensibly permit humanitarians to learn more about what is happening during a disaster, who needs help and where resources ought to be directed. Humanitarian satellite imagery analysts use high-resolution imagery to search for signs of impending famine, track population movements and monitor ongoing conflicts. These data collection efforts extend beyond the realm of immediate crisis response and the traditionally defined humanitarian sector and into the closely linked international development sector as well. The ‘digital traces’ left by mobile phones and social media provide information on population health, movement, public opinion and other matters that are not represented in often scant statistical data, providing development workers with a far greater range of quantifiable information than they had access to in the past (Mann and Ferenbok, 2013; Taylor and Schroeder, 2015). Burns (2015, p. 25) asserts that emergency management, disaster relief and international development are all ‘deeply interlocked regimes of knowledge, power, and morality’, with often amorphous boundaries between them. The interlocked nature of these sectors is reflected in their similar approaches to the use of ‘big data’ in their activities today.

These data are largely not collected by humanitarian or development workers themselves, nor posted on platforms that they control. Instead, humanitarian data analysts often rely on data that come from elsewhere, including social media networks like Facebook, satellites operated by governmental or private organisations and mobile phone records that are released to humanitarians by telecom providers. Using external data sources costs less, involves less labour and is a less time-intensive prospect for humanitarians than self-collection, with companies sometimes donating data directly to aid organisations as an act of ‘data philanthropy’ (Jerven, 2013; Kirkpatrick, 2011). Supporters of these practices believe that using these data and the technologies that analyse them is ‘more accurate, faster, and more egalitarian’ (Read et al., 2016, p. 7) than the analogue humanitarian data collection practices of the past (Burns, 2018).

Read et al. (2016) argue that this technocratic turn in humanitarian aid was mediated by a number of factors: the new corporate orientation that took root in aid organisations beginning in the 1980s, coupled with growing pressure from governments and donors for organisations to work more efficiently and transparently. The resulting public–private partnerships embody the logic of what Currion (2018) describes as ‘market humanitarianism’, a system that combines the worst aspects of both hierarchy and market (Seybolt, 2009).

This neoliberal approach is echoed in the rhetoric surrounding new technologies for humanitarian aid, like small camera-carrying drones: technology that is ‘efficient, effective, and cheap’ will help save more lives and reduce the cost of humanitarian aid (Sandvik and Jumbert, 2016, p. 12). This orientation also incorporates and normalises metrics stemming from the drive for ‘capital accumulations’ (Burns, 2019), emphasising cost saving, consumer choice and returns for aid donors who increasingly demand ever-more precise accountability in return for their financial investment. Digitised methods are increasingly presented as a means of achieving this level of granular accountability, permitting donors to more clearly see the ‘impact’ of their giving.

Today’s aid workers are often encouraged to learn from the ‘innovative’ technological expertise of private sector organisations (Mitchell, 2011). Humanitarian aid organisations are urged to hire data specialists and to increase the data literacy of existing staff. A World Economic Forum (2018) blog post on big data and humanitarian aid warns that ‘data is the new oil and, in order to maximise its extraction, there is a specific need for a skilled workforce’. Currion (2018) and Burns (2019) link this aid sector embrace of private sector tech expertise to the economic assumptions of neoliberalism, coupled with the growing sense that humanitarian aid itself functions as a sort of quasi-market. Currion (2018, p. 5) asks, ‘How better to succeed in this marketplace than to partner with organizations that have already succeeded in another marketplace?’

As a result, decision-makers in the aid sector were perhaps well-primed to welcome new partnerships with large corporations, especially those dependent on the extraction of data from individuals. Many of today’s largest and most widely known tech companies have announced recent partnerships with aid and development organisations. In 2017, Facebook’s Data for Good team launched its Disaster Maps Initiative, which uses Facebook data to search for areas where aid is most needed during a disaster. The company had partnered with 30 non-profit organisations and agencies by late 2018, including the United Nations International Children’s Emergency Fund (UNICEF), the International Federation of Red Cross and Red Crescent Societies and the World Food Programme (WFP) (Cheney, 2018). In September 2018, the UN, the World Bank, the International Committee of the Red Cross (ICRC), Microsoft Corp, Google and Amazon Web Services announced an ‘unprecedented global partnership to prevent future famines’, which will ‘use the predictive power of data to trigger funding through appropriate financing instruments, working closely with existing systems’ (World Bank, 2018). In February 2019, the WFP announced a five-year collaboration with data analysis company Palantir, to improve the efficiency of its aid delivery.

Drones in humanitarian aid: an overview

Drones are unmanned aerial vehicles (UAVs), usually equipped with some level of on-board computational power beyond that of traditional remote-controlled aircraft. They range in size from the military RQ-4 Global Hawk at 14,950 pounds (6.7 tonnes) to the Hubsan X4 Micro Quad Copter, which weighs less than an ounce (28 grams). They are built by at-home hobbyists using spare parts, by consumer technology companies and by military suppliers. This chapter focuses exclusively on small drones (under 55 pounds or 25 kg) that are built by non-military contractors and are intended for civilian purposes, as these appear to be the type most widely used by humanitarian and development actors.

Drones are not a new technology. The Kettering Bug, the first functional UAV, was introduced in 1918 and billed as a ‘flying torpedo’ (Newcome, 2004; Stamp, 2013), while armed drones equipped with television technology were first used in warfare in 1944 (Lerner, 2017). As these military-specific drones developed, so too did hobby remote-controlled aircraft: the world’s first successful small remote-controlled aircraft was built in 1938 (Yarrish, 2011). By the 1980s, Japanese farmers had begun using unmanned helicopters to spray crops (Sheets, 2018). The development and success of smartphones bolstered demand for small, powerful sensors and processors, and remote-controlled aircraft hobbyists began to incorporate these into their model designs, making their aircraft ‘smarter’. Hobby drone companies began to emerge in the 2000s, including Da-Jiang Innovations (DJI), a Chinese company that by 2019 would become the world’s largest civilian UAV manufacturer. In 2013, DJI released its Phantom model, a relatively easy-to-use quadcopter-style drone equipped with a global positioning system (GPS) receiver and capable of carrying a small external camera.

Today’s civilian, consumer drones are capable of carrying high-resolution cameras, which can be used to collect video and still images. They are equipped with GPS receivers and can geotag the images they collect, which can then be processed into geographically accurate maps. Unlike the long-range drones used by the military, small multirotor drones can fly for approximately 30 minutes and have a maximum transmission range of 4.1 miles (6.6 km), meaning that operators are required to be relatively close to what they are attempting to surveil (DJI, 2019). As of 2019, small civilian UAVs constitute a growing international market, while many nations have begun to introduce regulations that govern their use. At the moment, the so-called ‘big five’ technology companies (Apple, Facebook, Microsoft, Alphabet and Amazon) are largely not directly involved in the consumer drone market. However, drone-focused companies regularly use products that are produced by the ‘big five’ companies, such as Google Maps (for navigation), Apple iPhones and iPads (as a means of providing a visual interface for the drone), and Amazon Web Services (for data processing and storage).
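The geotag is worth dwelling on for a moment, because it is what turns a collection of photographs into spatial data: each image file carries its capture location in its EXIF metadata, and that metadata travels with the file wherever it is shared. As a rough illustration, not drawn from any particular humanitarian deployment, the following Python sketch reads the embedded GPS coordinates back out of a drone photograph using the Pillow imaging library; the file name is hypothetical and the tag layout assumes the standard EXIF GPS fields that most consumer drones write.

# A minimal sketch: reading the GPS position embedded in a drone photograph's
# EXIF metadata with Pillow. The file name is hypothetical; consumer drones
# that geotag imagery typically write these standard EXIF GPS fields.
from PIL import Image

GPS_IFD_TAG = 34853  # standard EXIF tag id for the GPSInfo block


def dms_to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds to signed decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    return -decimal if ref in ("S", "W") else decimal


def read_geotag(path):
    """Return (latitude, longitude) from a geotagged image, or None."""
    exif = Image.open(path)._getexif() or {}
    gps = exif.get(GPS_IFD_TAG)
    if not gps:
        return None
    # GPS sub-tags: 1 = latitude ref, 2 = latitude, 3 = longitude ref, 4 = longitude
    lat = dms_to_decimal(gps[2], gps[1])
    lon = dms_to_decimal(gps[4], gps[3])
    return lat, lon


if __name__ == "__main__":
    print(read_geotag("drone_photo.jpg"))  # hypothetical file name

The same metadata that allows photogrammetry software to stitch images into a geographically accurate map also means that anyone who later obtains the raw files can recover precisely where, and often when, they were taken.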

How humanitarians use drones

Civilian drones have been used by disaster responders since at least 2000, when the Japanese government used the Yamaha R-Max unmanned helicopter to remotely evaluate volcanic activity in Hokkaido (Sato, 2003). Damage assessment has proved to be a particularly popular use for civilian drones in humanitarian contexts. In 2012, the International Organization for Migration (IOM) used drones to map damage from Hurricane Sandy in Haiti, while in 2013, drones were used to assess damage from Typhoon Haiyan in the Philippines. Drones were used for damage assessment in the aftermath of the 2015 Nepal earthquake, while in 2018, drones were used by disaster responders during the response to Hurricane Florence in the Eastern United States (Karpowicz, 2018) and to assess damage from the Sulawesi earthquake in Indonesia (DroneDeploy, 2018).

Some humanitarian organisations are also experimenting with civilian drones for purposes outside of damage assessment, such as package delivery and population monitoring. In 2018, UNICEF and the government of the Republic of Kazakhstan announced the establishment of jointly run drone-testing corridors in that country (UNICEF, 2018), while the IOM regularly flew drones as part of needs and population monitoring efforts over refugee camps in Bangladesh (Humanitarian Data Exchange, 2019). The WFP has supported a number of humanitarian drone projects since 2015, including UAV cargo delivery experiments in the Dominican Republic (WFP Innovation Accelerator, 2019), UAV disaster assessment trainings in Mozambique (WFP Insight, 2019b) and emergency warning efforts in Bolivia (WFP Insight, 2019a).

Organisations dedicated to humanitarian and development drone use have also begun to emerge in response to the growing use of the technology. The Humanitarian UAV Network was founded in 2014, with the goal of serving as a central organisational point for humanitarian drone users. The organisation also introduced the Humanitarian UAV Network Code of Conduct (2019), which represents one of the first attempts at enshrining best practices and ethical standards in humanitarian drone use. The WeRobotics non-profit, founded in 2015, focuses on using robotics (particularly aerial civilian drones) to further humanitarian and development goals; its projects include both drone data collection and drone delivery efforts.

Drone use by humanitarian and development workers today can be broadly divided into two representative categories: data collection and the delivery of objects. Few data currently exist that attempt to quantify the particular drone models used by humanitarians, or the specific ways in which these models are used. Data collection uses are exceptionally heterogeneous, due to the ever-growing variety of sensors that can be attached to consumer drone models. Organisations may in some cases use a single drone model to collect multiple types of data, from photographs and orthorectified maps to 4K videos and thermal photographs. While manned aircraft can take very high-resolution photographs, it is much more expensive to operate these aircraft and the photography equipment they carry than it is to operate a small drone (Greenwood and Kakaes, 2015). Purchasing satellite imagery is also very expensive, and the process of tasking a satellite and analysing those images often requires a highly specialised skill set (Radjawali et al., 2017).

Because today’s consumer drones are inexpensive and relatively easy to use, they have been adopted by many individuals and organisations who previously lacked access to aerial imagery. Notably, drone imagery has been used by a number of indigenous groups (Paneque-Gálvez et al., 2017; Radjawali et al., 2017) as a means of codifying their rights over traditional land, countering figurative (and literal) erasure of their presence by the ‘land-grabbing’ activities of neoliberal capitalists.

Unlike exclusively data-collecting drones (which are often made by consumer manufacturers and are less technically complex to operate), delivery drones are more often developed and deployed by private companies like Zipline, which operates medical delivery drones in Rwanda. The company evocatively describes its drones as ‘life-saving’ technology, developed as part of its mission to ‘provide every human on Earth with instant access to vital medical supplies’ (Zipline, 2019).1

Data colonialism and surveillance capitalism

While critics of mass personal data collection have existed for as long as the practice and the companies that rely upon it have, these critiques have found new attention and consideration since around 2010, as the world grapples with unsettling realisations about the power of technology platforms to distort democracy, alter human relations and twist our societies into new and unwelcome forms (Manjoo, 2017). In the second decade of the twenty-first century, scholars have developed new frameworks that describe these data collection practices and their particular effects on human society. This chapter, in its consideration of drone data collection in humanitarian aid, will rely upon two such frameworks: data colonialism and surveillance capitalism. Both will be summarised briefly in this section.

Couldry and Mejias (2018, p. 2) describe ‘data colonialism’ as a process that combines the ‘predatory extractive practices of historical colonialism with the abstract quantification methods of computing’. In their analysis, data are the new oil, which must be appropriated from the human actors that generate them via a process that they call ‘data relations’. This constant tracking of human behaviour brings about a distinctively twenty-first-century manifestation of colonialism, one that normalises ‘the exploitation of human beings through data, just as historic colonialism appropriated territory and resources and ruled subjects for profit’ (Couldry and Mejias, 2018, p. 1). Key drivers of this new form of data colonialism include private companies that are heavily dependent upon the accumulation of individual data, such as social media companies like Facebook, mobile telecommunications companies like AT&T and advertising technology specialists such as Google.

The firms that profit from data colonialism, in this framework, view human social life as an open and ownerless source of raw data that is just there (Couldry and Mejias, 2018). For this data to be extracted for profitable purposes, however, Couldry and Mejias (2018, p. 3) contend that ‘life needs to be configured so as to generate such a resource’. Further, data from one individual in one moment need to be aggregated with other data from other times, permitting new conclusions to be drawn. Ultimately, the constantly watching, constantly tracking data practices perpetuated by these companies ‘invade the space of the self’ in their efforts to incorporate all of life into ‘an expanded process for the generation of surplus value’ (Couldry and Mejias, 2018, p. 8). Corporations attempt to normalise this process to the public and to regulators by likening the data that they rely upon to a natural resource that will be lost to humanity unless it is cleverly appropriated. Further, they argue that the data exhaust that humans emit cannot be owned by anyone (although it certainly can be used by anyone who has enough technical expertise to do so). Couldry and Mejias (2018) argue that these ideas resemble colonial arguments of the recent past that worked to justify the violent appropriation of terra nullius that clearly belonged to and was inhabited by indigenous people (Cohen, 2017).

Thatcher, O’Sullivan et al. (2016) describe the capture of big data related to individuals and to groups as an inherently asymmetric process. In their analysis, power asymmetry is integral to the process of data colonialism: the relations between the producers of data and the collectors and owners of data ‘mirror processes of primitive accumulation or accumulation by dispossession that occur as capitalism colonizes previously non-commodified, private times and places’ (Thatcher, O’Sullivan et al., 2016, p. 5; see also Harvey 2003, 2004).

Through this process, capital transforms data from a set of observations ‘into a multidimensional flow of algorithmically linked data points’, collected by various smart devices that transform human beings into ‘potential sensors’ (Thatcher, O’Sullivan et al., 2016, p. 5). Ultimately, Thatcher, O’Sullivan et al. conclude that: ‘If the processes by which big data commingle with everyday life are understood not as a “frontier” to be colonized, but as processes by which everyday life is colonized by “big money and big power”, then a new theoretical terrain for understanding big data is opened’ (Thatcher, O’Sullivan et al., 2016, p. 11, original emphasis).

The use of data colonialism as a metaphor for this type of large-scale, profit-driven data collection and use is by no means restricted to the aforementioned sources. A 2016 anonymously authored piece for Model View Culture (Anonymous, 2016) argued that the information and communications technology for development (ICT4D) sector continues the ‘legacy of colonialism within aid work’ via two trends: a ‘lack of ethical processes around data collection and management’ and ‘ongoing Western control over data’. The author concludes that it ‘seems clear that the lack of protections are used as another form of exploitation on the “global South” under the guise of aid, and the primary benefit is not intended for the project participants’.

In a 2018 interview with the Internet Health Report (Mozilla, 2018), Renata Avila, a Guatemalan senior digital rights adviser at the World Wide Web Foundation, defined ‘digital colonialism’ as ‘the new deployment of a quasi-imperial power over a vast number of people, without their explicit consent, manifested in rules, designs, languages, cultures and belief systems by a vastly dominant power’. Information researcher Michael Kwet (2019) describes digital colonialism as a ‘crisis’ that is ‘wreaking havoc on the global South’ as its practitioners consolidate power (in the form of data) and impose Silicon Valley’s ‘extraterritorial governance’ around the world. He stresses that it is ‘time to talk about Silicon Valley as an imperial force’, as a precursor to making the difficult changes required to counter its influence.

Related to data colonialism is the concept of ‘surveillance capitalism’, defined by technology scholar Shoshana Zuboff (2015, p. 75) as ‘an emergent logic of accumulation in the networked sphere’ that is dependent on a ‘global architecture of computer mediation’. The framework of surveillance capitalism is similar to that of ‘data colonialism’ in its criticism of new means of technology-driven profit creation that are dependent upon the extraction of data from human sources. It represents a second useful framework for considering the collection of data by humanitarian and development workers with drones.

Zuboff (2019, p. 8) notes that surveillance capitalism is not a technology in and of itself, but is instead a ‘logic that imbues technology and commands it into action’ that ‘unilaterally claims human experience as free raw material for translation into behavioural data’. The resulting products are ‘about predicting us, without actually caring what we do or what is done to us’ (Zuboff, 2019, p. 70). Under this logic – which shares some attributes with the concept of data colonialism elucidated by Couldry and Mejias (2018) – the practitioners of surveillance capitalism are highly motivated to find new sources of raw material from which they may extract behavioural surplus data, encompassing every aspect of human experience that is mediated through technology, including our voices, faces and elemental details of our likes and dislikes (Zuboff, 2019, p. 70). Further, Zuboff (2019, p. 8) asserts that surveillance capitalists are attempting to intervene in our lives in an effort to ‘nudge, coax, tune, and herd behaviour toward profitable outcomes’, leading to a world in which technology and technology companies work not just to ‘know our behaviour, but also to shape our behaviour at scale’. According to Zuboff (2019), the ultimate goal is to automate us, by subordinating the means of production to a means of behavioural modification, leading ultimately to the creation of a new type of power that she calls ‘instrumentarianism’. This power manipulates human behaviour to suit the needs of the corporate powers that profit from it, rather than to fit our own needs. It exerts itself through the ‘automated medium of smart networked devices, things, and spaces’ and it is becoming ever-more difficult to meaningfully escape from its influence (Zuboff, 2019, p. 8).

Echoing language that is often applied to settler colonialists, Zuboff (2019, p. 9) describes Google as a pioneer of surveillance capitalism, an organisation that has launched an ‘unprecedented market operation into the unmapped spaces of the Internet, where it faced few impediments from law or competitors, like an invasive species in a landscape free of natural predators’. According to Zuboff (2015, p. 78), the companies she is critiquing are engaged in ‘incursion into undefended private territory until resistance is encountered’.

Zuboff (2019, p. 10) argues that surveillance capitalist companies do not view the public as their customers, because their true customers are ‘the enterprises that trade in its markets for future behavior’. Nor do they view the public as potential employees, because the technology companies of today generally employ far fewer people than the most profitable corporations of the past. Instead, Zuboff (2019) suggests that Google and similar large corporations view the public as a source of raw material in a fashion that evokes the colonial idea of the exploitable and easily manipulated ‘Other’ (Eves, 1996). Colonists were not accountable to the people they profited from, nor are practitioners of surveillance capitalism accountable to the people whose data they extract: the surveillance capitalist company simply takes what it wants (Zuboff, 2015). And just as the colonialists of the past loudly asserted their moral right to extract raw material from their colonies, so do today’s surveillance capitalists ‘assert their right to invade at will’, using arguments based around self-determination, Darwinian survival and the supposed inherent value of innovation to normalise their ‘digital dispossession’ of humanity (Zuboff, 2019, pp. 24, 100). Ultimately, Zuboff (2019, p. 100) argues that we ‘are the native peoples now whose tacit claims to self-determination have vanished from the maps of our own experience’, rendering us similar to the indigenous inhabitants of terra nullius described by Cohen (2017, p. 213).

Humanitarian aid, drones and data colonialism

The international humanitarian aid and development sectors are regularly criticised for their perceived entanglement with colonial and neocolonial systems and attitudes. According to these critics, aid and development professionals may replicate old asymmetries of power: they may regard the people that they are attempting to help as ‘backward’ (Olivius, 2015), fail to consider the needs or perceptions of aid recipients (Dijkzeul and Wakenge, 2010) and impose neocolonial economic and social patterns upon the countries they work in (Langan, 2018). Critics of digital humanitarianism have also begun to question the optimistic narrative that accompanied its introduction, and the power that data holders now have over the development and aid sector (Taylor and Broeders, 2015). Aid organisations hold much more information about the people they assist than the latter do about them, in a pattern similar to the asymmetries in knowledge and power between the average person and the data aggregation companies of today (Zuboff, 2019). Aid organisations reliant upon the humanitarian imperative to push through technical innovation may use this motivation to weaken critical voices about the impact of these technologies on data privacy and security (Hosein and Nyst, 2013; S. McDonald, 2016).

Duffield (2016, p. 148) criticises the ‘hubris and technological determinism’ that has come with the ascendance of digital technology in humanitarian aid work in the global South. He asks whether this new digital connectivity is helping to ‘reproduce stagnation, inequality and external control rather than ameliorate such conditions’. In their ethnographic study of the response to Typhoon Haiyan in the Philippines, Madianou et al. (2016) investigated the effectiveness of mobile phone-based ‘accountability to affected people’ initiatives. They found that these data were largely not ‘fed back’ to the communities actually impacted by the typhoon, but were instead sent to donors as evidence of ‘impact’, creating an illusion of accountability that was not borne out by actual results. They concluded that ‘rather than improving accountability to affected people, digitized feedback mechanisms sustained humanitarianism’s power asymmetries’ (Madianou et al., 2016, p. 960).

Read et al. (2016, p. 11) also take note of this apparent one-way interchange of data between local communities and the organisations that purport to help them:

Although cloaked in the language of empowerment, data technology may be based on an ersatz participative logic in which local communities feed data into the machine (either through crowd sourcing, or by being enumerators or subjects in most traditional surveys) but have little leverage on the design or deployment of the technology.

The above critiques are applicable to a wide range of technologies and processes, many of which are used by digital humanitarians today. They are by no means specific to civilian drones, but encompass a broader set of ethical and operational concerns and fears. Drones do, however, offer us a particular entry point into a broader debate over the place of data collection technology in the aid and development sector, and it is to the specific matter of spatial data that I now turn.

While the positive aspects of spatial data collection are widely understood and discussed in the humanitarian and development sector today, considerably less attention has been given to its downsides and to the historical and ethical implications of these data collection practices. Colonists have long relied on cartography as a means of cementing territorial control via the collection of spatial information and knowledge (Hunt and Specht, 2019; Kirsch, 2016; Sletto, 2011). Maps do not inevitably encode colonial power, but they are also not necessarily objective or inherently neutral. Rather, they can make possible new modes of exploitation, such as the processes described as data colonialism and surveillance capitalism.

Proponents of open-source mapping projects often argue that their efforts help people by making them more ‘legible’ (Scott, 1998). To be ‘put on the map’ is a way to gain societal legitimacy and access to important services and legal protections. But being made more legible is not always desirable. Critics are increasingly challenging the narrative that maps (like those produced with drone imagery) are inherently tools of empowerment. They note that these map-making efforts ‘frame recognition in terms of titling, demarcation and legal reform, sidestepping more complex political questions about how indigenous claims have been shaped by processes of colonialism, dispossession and inequality’ and force people to conform to externally imposed notions of property ownership (Bryan, 2011, p. 49).

Taylor and Broeders (2015) observe that the practice of ‘reading like a state’ outlined by Scott (1998) in his description of legibility has been altered in the big data era. They suggest that remote data analytics often observe people who are unaware that they are being observed and who cannot meaningfully consent to the collection of their data even if they wished to do so (Taylor and Broeders, 2015). Further, they draw a distinction between Scott’s notion of legibility and the data-driven visibility of today, noting that people are made visible through the huge volume of observed data about them that can be collected by governments and private corporations, and that people who are more connected are therefore more visible than others (Taylor and Broeders, 2015). Visibility, more so than legibility, offers the power to ‘influence and intervene to a wider, more distributed set of actors: the corporations who gather and analyse the data, plus whoever they choose to share it with (or can capture it through other means), who may be state actors, international development institutions, or other corporate partners’ (Taylor and Broeders, 2015, p. 230).

This concept is illustrated clearly in Facebook’s recent effort to create a data map of the human population by applying computer vision techniques to match satellite imagery with government census information. This data map covers 23 countries and Facebook intends to use this information to determine how best to deliver internet services via satellites and drones to people living in different geographic areas (Shinal, 2017). The Facebook Connectivity Lab that has pioneered this programme claims its mission is to ‘connect the unconnected and underserved in the world’ (Gros and Tiecke, 2016). However, the Free Basics connectivity that the company offers does not give users access to the open web, instead mediating their connectivity through Facebook’s application. Facebook appears to be carrying out this project under the assumption that connectivity and visibility are inevitably superior to the alternative. This behaviour could credibly be read as not dissimilar to that of colonising powers who worked to impose ‘modern’ (and implicitly superior) ways of life and being upon indigenous people. Constine (2018) suggests that Facebook and its advocates appear to ‘believe that some internet is better than none for those who wouldn’t otherwise be able to afford it’, even if that connectivity comes at the price of making new users visible to (and thus exploitable by) the practitioners of surveillance capitalism.

People who reside in less-connected and less-developed parts of the world increasingly find themselves forced to choose between visibility and invisibility, and the introduction of drone technology – whether it is used to make maps or to deliver connectivity to social media applications – is part of this choice. Participatory mapping can therefore present ‘an impossible choice; one in which participants encounter the dilemma of needing to shed or set aside notions of how territory has been historically contested and negotiated in order to secure legal recognition of their rights in a hoped-for future’ (Bryan, 2011, p. 46). In an increasingly data-mediated world, people who refuse to make maps themselves may still be mapped by corporations like Facebook or Google, or by resource-extraction companies that are ‘commodifying and transforming space at an unprecedented scale’ (Radjawali et al., 2017, p. 818). Further, they lack ‘clear exit rights from the effects of heavily deployed technologies’ (Fox et al., 2006, p. 100). Fox (2002) argues that, without control over their spatial data, disempowered people will be no better off than if they had not been mapped at all.

Data colonialism and surveillance capitalism remove this control from everyone who is encompassed in their networks. For example, Radjawali and Pye (2015) cite a case where a small Indonesian village was literally erased from government censuses, as the satellite imagery that the census relied upon was of too low a resolution for the settlement to be visible. Ironically, the problems created by these ubiquitous technologies may only be addressable through the technology itself. The solution to failures of humanitarian technology is thus forced to come from the application of more technology (Hershock, 1999; Tenner, 1996). Map-makers from indigenous communities are aware of these dynamics. By using drones, they can fight against systems where ‘access to maps and spatial information is limited and commodified’ (Paneque-Gálvez et al., 2017; Radjawali and Pye, 2015, p. 1). As participants in an indigenous mapping workshop concluded, ‘the more we map, the more likely it is that we will have no choice but to map’ (Fox et al., 2006, p. 105).

Drone data, like all forms of digitally mediated data, cannot be ‘uncollected’. Data that are collected for one purpose can be appropriated and used in unintended and unexpected ways. The rise of digital humanitarianism has been accompanied by a spread of open-data principles from the world of technology to the aid sector, which place importance on making data as publicly available and easily accessible as possible (Principles for Digital Development, 2019). Humanitarian data collectors who subscribe to these principles must therefore make choices about which data are too sensitive to share publicly and which data can be made freely available on sources designated for this purpose (such as the Humanitarian Data Exchange or drone-specific platforms like OpenAerialMap2). At the time of writing, there is little clarity on how these sensitive decisions are being made by humanitarian organisations regarding drone data.

As of February 2019 it is possible to download high-resolution drone maps of the Cox’s Bazar refugee camp, collected by IOM staff, from the Humanitarian Data Exchange and from the OpenAerialMap platform (Humanitarian Data Exchange, 2019). The OpenAerialMap platform asks users to submit high-resolution drone data and aerial maps, which are then made publicly available. The platform does not appear to screen these data for sensitive imagery, nor are privacy or security risks mentioned explicitly in the website’s documentation. In 2018, Humanitarian OpenStreetMap Team (HOT) volunteers were encouraged to use this drone imagery of the camp to identify and map roads that cut through the area.
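To give a sense of how little friction separates openly published drone imagery from any third party who wants it, the sketch below queries a catalogue of this kind for imagery over a bounding box and lists what it returns. The endpoint, query parameters and response fields shown are assumptions about OpenAerialMap’s public metadata API rather than a verified specification, and should be checked against the platform’s own documentation; the point is simply that programmatic, bulk access to such imagery requires no registration, review or justification.

# A hedged sketch of pulling openly published drone imagery metadata from the
# OpenAerialMap catalogue. The endpoint, parameters and response fields are
# assumptions about the platform's public API, not a verified specification.
import requests

# Assumed catalogue endpoint; check OpenAerialMap's own API documentation.
CATALOGUE_URL = "https://api.openaerialmap.org/meta"


def list_imagery(bbox, limit=20):
    """Return (title, link) pairs for imagery intersecting a bounding box.

    bbox is (min_lon, min_lat, max_lon, max_lat) in WGS84 degrees.
    """
    params = {"bbox": ",".join(str(c) for c in bbox), "limit": limit}
    response = requests.get(CATALOGUE_URL, params=params, timeout=30)
    response.raise_for_status()
    results = response.json().get("results", [])
    # The 'uuid' field is assumed to hold a direct link to the full-resolution
    # image; no login or editorial review stands between it and the requester.
    return [(item.get("title"), item.get("uuid")) for item in results]


if __name__ == "__main__":
    # Illustrative bounding box only (roughly south-eastern Bangladesh).
    for title, link in list_imagery((92.0, 21.0, 92.3, 21.3)):
        print(title, link)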

Most critiques of consumer drone data emphasise dangers to individuals, who may be photographed or videotaped by a stealthy drone without their knowledge or consent. Fewer critiques consider the dangers that might be presented to groups of people or individuals by drone imagery that does not portray any individuals at all, but rather their neighbourhoods, homes and other aspects of their physical context (Taylor et al., 2017). Indeed, little research exists that directly links the public dissemination of high-resolution drone images and maps to their potentially harmful effects on people affected by disaster.

Researchers are increasingly aware of the problems posed by data aggregation (de Montjoye et al., 2015; Xu et al., 2017), by which disparate sources of data are combined to produce new and often unexpected conclusions. These risks are by nature difficult to predict. It is also difficult to predict what non-humanitarian actors, such as corporations or government entities, might do with these data if they are permitted open access to them. Data colonialism is largely dependent upon the extraction and recombination of terra nullius data – like the data that Facebook collects – into new forms, forms that are often opaque or unintelligible to the people whose data have been appropriated. As humanitarian organisations work to better organise and consolidate the data they hold to improve their operational efficiency (as is the motive behind the WFP’s collaboration with Palantir, announced in February 2019), it appears inevitable that the risks posed by the open sharing of these data will grow. Humanitarians may also be motivated to consolidate and share these data to protect themselves, as part of their efforts to reduce on-the-ground risks to their staff (Hoelscher et al., 2017). Duffield (2016, p. 161) warns that these ‘security concerns are encouraging the convergence of the localised humanitarian, development, government and security databases into systems with a wide international reach’, largely unchecked by regulations or oversight.

This security-motivated drive to collect more data about more people operates in tandem with drives that motivate the private sector practitioners of surveillance capitalism and data colonialism. There are numerous instances of large internet companies sharing data they hold on their users with national governments, either voluntarily or under legal or political duress. Furthermore, the data that are held by either governments or by private sector companies may be stolen or otherwise revealed to the public. Humanitarians cannot assume that data shared with one actor will reliably stay with that actor.

Experimentation is another site of potential risk connected to humanitarian drone technology. The authors of a Google-published white paper on the development of infrastructure that permits company engineers to run ‘better, faster’ tests write that ‘At Google, experimentation is practically a mantra’ (Tang et al., 2010, p. 1). This experimental attitude is de rigueur among Silicon Valley companies – for example, Facebook’s former motto of ‘move fast and break things’ (Vardi, 2018) – and it bleeds over into the humanitarian world’s technology efforts. In 2017, when the government of Malawi and UNICEF launched an air corridor for testing UAV uses, private companies were encouraged to apply to use the airspace for testing purposes (UNICEF, 2017). The drone delivery company Zipline, mentioned above, highlights this experimental approach in its work, as it tests its technology in Africa with the eventual goal of exporting it to the United States and Europe.

McDonald et al. (2017) asked at what point humanitarian innovation should be described as ‘human subjects experimentation’, noting that this dividing line is poorly defined. Sandvik (2015, p. 75) observed that Africa is ‘the perfect recipient of good drone interventionism’, not only because the continent is construed as being eternally in need of externally imposed aid, but because of its (relative) inability to resist the rescue and investment efforts of outsiders, regardless of whether they target African territory or African airspace.

This experimentation is often accompanied by another assumption: that people who live in poorer parts of the world must know little about (and thus will be frightened by) civilian drones. In a 2016 survey of humanitarian actors carried out by Fondation suisse de déminage (FSD), 57 per cent of respondents felt that ‘local populations feel threatened by the use of drones’, although, as the survey authors note, these responses seem to be at odds with the highly limited available evidence regarding public perceptions of the technology (Soesilo and Sandvik, 2016).

These survey replies align with a common perception that people in developing countries are not as technically aware or technically competent as the humanitarians that are attempting to help them, an assumption that logically encompasses drone technology. If humanitarians operate under the (at this time largely unsupported) assumption that local people are frightened of or ignorant about drones, they may use the technology anyway, but in ways that exclude local people. They may fail to work with local partners who are knowledgeable about the technology and the context. As Tingitana and Kaiser (2018) note in a blog post for the WeRobotics drone non-governmental organisation (NGO): ‘Time and time again, we’ve seen large organizations in these sectors hire foreign drone companies to carry out aerial surveys that local drone pilots could do equally well and in a fraction of the time’. The use of drone technology by humanitarians can also, without adequate oversight or consideration of these issues, reinforce existing inequalities pertaining to the production of data and spatial data across gender, racial and class lines (Paneque-Gálvez et al., 2017; Radjawali et al., 2017).

It is also worth considering where the drones themselves come from. Consumer drones are widely available across the world and require less effort and skill to use than home-made models. It is reasonably safe to assume that time-strapped aid workers will largely rely upon these off-the-shelf products. DJI, the current market leader in the consumer drone sector (Skylogic Research, 2019), does not have a business model that currently depends upon the extraction of user data, but this does not mean its model will not shift in the future. Few competitors produce similar products at the same price point as DJI, creating the risk that aid users of the technology will be locked into its ecosystem.

Initiatives are underway that seek to make drones a more standardised and legitimate part of countries’ national airspaces, including the creation of unique digital drone ‘licence plates’ and other means of remote identification (Moon, 2018). Some of these proposals may require that drone users pay a significant fee to operate in national airspace systems, or may centrally collect the data of drone users, integrating drones more thoroughly into the processes of data colonialism and surveillance capitalism. Some nations, such as South Africa, already require drone users to pay a large licensing fee, while others, like Mexico, require groups that photogrammetrically process drone data to hold a government permit (Paneque-Gálvez et al., 2017). A world in which drones are only usable by the rich and well-connected (including international aid workers) will not be a more equitable one.

Addressing data colonialism in drone use

Digital humanitarianism is an inherently hopeful endeavour, a movement that perceives itself as harnessing the power of networked technology to help people around the world, breaking down technical, social and economic barriers. Spatial technologies, like drones, ‘hold incredible epistemological and tactical promise’ (Burns, 2018, p. 8); they are tools that can conceivably be used to make the world a more equitable and ethical place. The processes of data colonialism and surveillance capitalism threaten to leave these hopes unrealised. Instead, they risk creating a ‘surveillance humanitarianism’ that appropriates the ethical goals and considerations of the aid movement at the expense of both humanitarian aid workers and the people they wish to serve. These forces are powerful, but it is not too late to push back against them. Indeed, it is clear that data colonists recognise (and wish to be linked to) the moral authority that humanitarian aid still holds. By using this authority – in a number of differing, but interlocking ways – the aid and development sectors can push back against the march of digital dispossession.

The first step may be to recognise – and reject – digital colonisers’ arguments about the supposed necessity of their data extraction methods. Hosein and Nyst (2013, p. 58) write that ‘the choice between privacy and development creates a false dichotomy and spurs over-simplified arguments about the role of technology’. Similarly, Zuboff (2019) argues that technology companies have created a false dichotomy, where we are led to believe that their surveillance systems are inevitable and a required part of capitalism. She calls for mass resistance to these quantifying forces, which must of necessity be accompanied by political pressure beyond the abilities of individuals.

Couldry and Mejias (2018, p. 11) also describe a vision of resisting data colonialism that ‘rejects the idea that the continuous collection of data from human beings is natural, let alone rational; and so rejects the idea that the results of data processing are a naturally occurring form of social knowledge, rather than a commercially motivated form of extraction that advances particular economic and/or governance interests’. They previously noted that ‘a continuously trackable life is a dispossessed life, no matter how one looks at it. Recognizing this dispossession is the start of resistance to data colonialism’ (Couldry and Mejias, 2018, p. 10).

A humanitarian movement that recognises these forces should be empowered to assert its values. It should feel capable of turning down collaborations or data-sharing agreements with data colonisers, but it must also be able to clearly explain why. Some humanitarian aid and development organisations are developing codes of conduct, handbooks and other compilations of best practices to serve as a ‘moral road map’ for their interactions with data-collecting technologies and technology companies. In the case of drones, the Humanitarian UAV Network Code of Conduct (2019) is the most comprehensive current attempt to develop a set of ethical standards for humanitarian drone use, covering data issues, local involvement, privacy protection and other categories. The ICRC’s 2017 Handbook on Data Protection in Humanitarian Action (Kuner and Marelli, 2017) also specifically considers the problem of data protection with drones. Both standards call for humanitarian drone users to conduct both practical and ethical risk assessments prior to flight, to develop data-sharing techniques in advance of collecting the data, to engage local communities to the extent realistically possible in data collection efforts and to minimise the amount of data collected. However, these standards lack any real enforcement mechanism, and it is unclear to what extent they are actually being adhered to today. More work is needed to ensure that they are both widely known and followed in the humanitarian aid and development sector. While these ethical codes are important, they are no panacea. It is clear that adherence to a certain set of rules or criteria is not enough to arrest the spread of data colonialism and surveillance capitalism. Powerful companies and their supporters are capable of influencing what these rules look like, while toothless codes of conduct that rely largely on internal policing merely put a legalistic fig leaf over extractive practice.

Little academic research exists that attempts to formally link human security and safety to the collection and dissemination of drone-collected data. There are few technical or formal guidelines that concern themselves with minimising these risks. For example, it is unclear what types of blurring or image redaction might reduce the risk of harm to affected populations from drone-collected data, what types of sensitive information are discernible from drone imagery collected at different altitudes and at different resolutions, or how security risks from drone video differ from those posed by drone photographs. This is likely due to the relative novelty of the technology and its uses in humanitarian and disaster-response situations. Whatever the cause of these gaps, they must be filled if aid organisations intend to conduct meaningful risk assessments of the collection of drone data.
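Guidance of this kind would have to rest on some basic technical building blocks. As a rough sketch of two of them, the Python below first estimates ground sampling distance (the ground footprint of a single image pixel) from flight altitude and camera geometry, which gives a first indication of how much detail imagery from a given altitude can reveal, and then blurs a manually specified region of an image as a crude form of redaction. The camera parameters and pixel coordinates are illustrative assumptions rather than recommendations, and blurring of this sort has not been validated as adequate protection for affected populations.

# A rough sketch of two building blocks for drone-imagery risk assessment:
# (1) estimating ground sampling distance (GSD) from flight parameters, and
# (2) blurring a chosen region of an image as a simple form of redaction.
# Camera values and file names below are illustrative assumptions only.
import cv2  # OpenCV


def ground_sampling_distance(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Approximate ground footprint of one pixel, in centimetres."""
    gsd_m = (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)
    return gsd_m * 100


def blur_region(image_path, output_path, x, y, w, h, kernel=51):
    """Write a copy of the image with one rectangular region Gaussian-blurred."""
    image = cv2.imread(image_path)
    region = image[y:y + h, x:x + w]
    image[y:y + h, x:x + w] = cv2.GaussianBlur(region, (kernel, kernel), 0)
    cv2.imwrite(output_path, image)


if __name__ == "__main__":
    # e.g. a small quadcopter at 100 m with a 6.17 mm-wide sensor, 4.7 mm focal
    # length and 4000 px image width: roughly 3.3 cm of ground per pixel.
    print(round(ground_sampling_distance(100, 4.7, 6.17, 4000), 1), "cm/px")
    blur_region("orthophoto_tile.jpg", "redacted_tile.jpg", x=1200, y=800, w=400, h=300)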

There is also very little research that attempts to determine how different groups of people in different cultures and in different geographic locations feel about and approach the use of drones and drone-collected imagery in humanitarian contexts. The vast majority of existing survey and opinion research related to drone technology and privacy has been carried out with US and European populations. This absence of research, unfortunately, extends beyond drone data. As Payal Arora (2016, p. 1694) writes, there ‘is a dearth of studies on how marginalized populations in the global South view, construct, and practice privacy’. Local, contextual knowledge is protective against processes of data colonialism, and involving people more closely in data protection processes gives them a chance to maintain agency over information that is collected about them. In the absence of this information, humanitarian aid workers run an increased risk of using the technology in ways that reinforce data colonialism and surveillance capitalism: consider the strange absence of local people in the aspirational ‘technoscape of the Ebola drone’ described by Sandvik (2015, p. 11). Still, it is important to note that better research will do little to protect individuals if large aid organisations continue to engage in partnerships and data-sharing agreements with data colonists.

Localisation is one way of addressing these risks, in large part because it shares power and influence over technical projects among more people. Andrew Schroeder (2018) of WeRobotics commented in a blog post that ‘if drones are going to fulfil their humanitarian potential, the structures, skills, and knowledge that guide them cannot depend on large international agencies or rapid importation at times of crisis’. One such practice is participatory mapping, which includes many different community members and stakeholders in the mapping process. Often this is done explicitly as a means of reducing the ‘view from above’ implications of colonial mapping projects. It is not a panacea: data from such participatory projects can still be shared (and exploited) in unexpected ways (Specht and Feigenbaum, 2018). However, these efforts do give the people whose data are being collected a clear role and a voice in the collection and analysis of their information, and more such work is needed. Paneque-Gálvez et al. (2017, p. 16), referring to Peruvian community mapping efforts that incorporate drone imagery, write that ‘we need to Amazonize drone technology so that it can become truly useful in such a socially and environmentally challenging context’. They call for the establishment of new drone schools that cater to the specific context of indigenous people, pointing to Irendra Radjawali’s Swandiri Institute (Radjawali and Pye, 2015) as an example of this model.

Humanitarian and development workers should be aware of the legal status of drone use in the areas where they work, and of how changes to these laws may affect local users of the technology. Regulations that effectively restrict drone use to deep-pocketed and well-connected organisations strip the technology of its grassroots potential. More powerful organisations should work to assist smaller ones in pushing back against exclusionary lawmaking.

As part of supporting these localisation processes, humanitarian drone users should also ensure that they look beyond Eurocentric ideas about big data, data sharing and technology. Arora (2016) writes that today’s debates about the impact of big data on human society continue to be highly Western-centric. Moreover, while the West is in the midst of considerable societal debate about the negative impact of big data on its societies, ‘big data projects [in] the Global South have an overwhelmingly positive connotation’ (Arora, 2016, p. 1682). Arora (2016, p. 1694) further notes that we must ‘pay more attention to where the values in digital design emerge and who dictates these information infrastructures to create allowances for a richer databased identity’, adding that the global South ‘should be actively engaged with current debates – such as the right to be forgotten – as multinational IT companies confront national sentiments, values, and institutions, illustrating how context continues to matter’.

Milan and Treré (2018), meanwhile, in their exploration of ‘big data from the South’, ask us to consider how the processes of datafication that this chapter critiques might look ‘upside down’. They warn against the trap of ‘digital universalism’ that attempts to ‘gloss over differences and cultural specificities’ (Milan and Treré, 2018, p. 324) in its criticism of big data and the practices that accompany it. Drawing on work by Arora (2016) and Udupa (2015), they note that while the majority of the world’s population resides outside of the West, we ‘continue to frame key debates on democracy and surveillance – and the associated demands for alternative models and practices – by means of “Western” concerns, contexts, user behaviour patterns, and theories’ (Milan and Treré, 2018, p. 320). They urge us to make ‘the move from datafication to data activism/data justice’ by examining the ‘diverse ways through which citizens and the organized civil society in the South engage in bottom-up data practices for social change and resist a datafication process that increases oppression and inequality’ (Milan and Treré, 2018, p. 328).

Ultimately, we can decolonise our use of drone technology in humanitarian aid. It is well within our power to decline to create a system of ‘surveillance humanitarianism’ or to facilitate its creation by others. We can realise the power of new technology to save lives and reduce human suffering without embracing the practitioners of data colonialism and surveillance capitalism. The humanitarian aid sector should stand as an essential moral voice against data colonisers. Indeed, its survival as an independent and effective movement depends on it.

1. It should be noted that delivery drones can also be used to collect visual spatial data.

2. See https://map.openaerialmap.org/#/-84.375,-5.528510525692789,3/square/21000?_k=8tjb1j.

References

Anonymous (2016) ‘Data colonialism: critiquing consent and control in “tech for social change”’, Model View Culture, 15 Nov., https://modelviewculture.com/pieces/data-colonialism-critiquing-consent-and-control-in-tech-for-social-change (accessed 28 April 2019).

Arora, P. (2016) ‘The bottom of the data pyramid: big data and the global South’, International Journal of Communication, 10: 1681–99.

Bryan, J. (2011) ‘Walking the line: participatory mapping, indigenous rights, and neoliberalism’, Geoforum, 42 (1): 40–50, https://doi.org/10.1016/j.geoforum.2010.09.001.

Burns, R. (2015) Digital Humanitarianism and the Geospatial Web: Emerging Modes of Mapping and the Transformation of Humanitarian Practices (Doctoral dissertation, University of Washington), http://hdl.handle.net/1773/33947 (accessed 28 April 2019).

— (2018) ‘Datafying disaster: institutional framings of data production following Superstorm Sandy’, Annals of the American Association of Geographers, 108 (2): 569–78, https://doi.org/10.1080/24694452.2017.1402673.

— (2019) ‘Let the private sector take care of this: the philanthro-capitalism of digital humanitarianism’, in M. Graham (ed.), Digital Economies at Global Margins (Cambridge, MA: MIT Press), pp. 129–52.

Cheney, C. (2018) ‘How Facebook has tripled its disaster maps partnerships’, Devex, 18 Dec., https://www.devex.com/news/how-facebook-has-tripled-its-disaster-maps-partnerships-93951 (accessed 28 April 2019).

Cohen, J. (2017) ‘The biopolitical public domain: the legal construction of the surveillance economy’, Philosophy & Technology, 31 (2): 213–33, https://doi.org/10.1007/s13347-017-0258-2.

Constine, J. (2018) ‘Facebook’s Internet.org has connected almost 100M to the “internet”’, TechCrunch, https://techcrunch.com/2018/04/25/internet-org-100-million (accessed 12 May 2020).

Couldry, N. and U. Mejias (2018) ‘Data colonialism: rethinking big data’s relation to the contemporary subject’, Television & New Media, 33 (4): 1–14, https://doi.org/10.1177/1527476418796632.

Crampton, J.W. (2016) ‘Assemblage of the vertical: commercial drones and algorithmic life’, Geographica Helvetica, 71: 137–46.

Currion, P. (2018) ‘Network humanitarianism’, Overseas Development Institute, https://www.odi.org/sites/odi.org.uk/files/resource-documents/12202.pdf (accessed 28 April 2019).

Dajiang Enterprises (DJI) (2019) ‘Phantom 4 Pro specs’, https://www.dji.com/phantom-4-pro/info (accessed 28 April 2019).

de Montjoye, Y., L. Radaelli, V. Singh and A. Pentland (2015) ‘Unique in the shopping mall: on the reidentifiability of credit card metadata’, Science, 347: 536–9, https://doi.org/10.1126/science.1256297.

Dickinson, E. (2016) ‘Is now the moment the humanitarian data revolution begins?’ Devex, 10 June, https://www.devex.com/news/is-now-the-moment-the-humanitarian-data-revolution-begins-88270 (accessed 28 April 2019).

Dijkzeul, D. and C.I. Wakenge (2010) ‘Doing good, but looking bad? Local perceptions of two humanitarian organisations in eastern Democratic Republic of the Congo’, Disasters, 34 (4): 1139–70, https://doi.org/10.1111/j.1467-7717.2010.01187.x.

DroneDeploy (2018) ‘Drones assess the aftermath of Indonesia’s destructive earthquake’, 13 Nov., https://blog.dronedeploy.com/drones-assess-the-aftermath-of-a-indonesias-destructive-earthquake-1e60611d0abd (accessed 28 April 2019).

Duffield, M. (2016) ‘The resilience of the ruins: towards a critique of digital humanitarianism resilience’, Resilience, 4 (3): 147–65, https://doi.org/10.1080/21693293.2016.1153772.

Eves, R. (1996) ‘Colonialism, corporeality, and character: Methodist missions and the refashioning of bodies in the Pacific’, History and Anthropology, 10 (1): 85–138, https://doi.org/10.1080/02757206.1996.9960893.

Fontainha, T., P. de Oliveira Melo and A. Leiras (2016) ‘The role of private stakeholders in disaster and humanitarian operations’, Journal of Operations and Supply Chain Management, 9 (1): 77, https://doi.org/10.12660/joscmv9n1p77-93.

Fox, J. (2002) ‘Siam mapped and mapping in Cambodia: boundaries, sovereignty, and indigenous conceptions of space’, Society and Natural Resources, 15 (1): 65–78.

Fox, J., K. Suryanata, P. Hershock and A. Pramono (2006) ‘Mapping power: ironic effects of spatial information technology’, Participatory Learning and Action, 54: 98–105, http://pubs.iied.org/pdfs/14507IIED.pdf#page=99 (accessed 1 December 2018).

Garrett, B. and K. Anderson (2018) ‘Drone methodologies: taking flight in human and physical geography’, Transactions of the Institute of British Geographers, 43 (3): 341–59, https://doi.org/10.1111/tran.12232.

Greenwood, F. and K. Kakaes (2015) ‘Drones and aerial observation’, New America, https://drones.newamerica.org/primer (accessed 1 December 2018).

Gros, A. and T. Tiecke (2016) ‘Connecting the world with better maps’, Facebook Research, 21 Feb., https://research.fb.com/connecting-the-world-with-better-maps (accessed 28 April 2019).

Harvey, D. (2003) ‘The right to the city’, International Journal of Urban and Regional Research, 27 (4): 939–41.

— (2004) ‘Class relations, social justice and the politics of difference’, in Place and the Politics of Identity (Oxford: Routledge), pp. 48–72.

Hershock, P.D. (1999) Reinventing the Wheel: A Buddhist Response to the Information Age (Albany: State University of New York Press).

Hoelscher, K., J. Miklian and H. Mokleiv Nygård (2017) ‘Conflict, peacekeeping, and humanitarian security: understanding violent attacks against aid workers’, Journal of Human Rights, 24 (4): 538–65.

Hosein, G. and C. Nyst (2013) Aiding Surveillance: An Exploration of How Development and Humanitarian Aid Initiatives Are Enabling Surveillance in Developing Countries (London: Privacy International).

Humanitarian Data Exchange (2019) ‘IOM Bangladesh – needs and population monitoring (NPM) Cox’s Bazar Rohingya refugees settlements UAV imagery’, https://data.humdata.org/dataset/iom-npm-cox-bazar-uav-imagery (accessed 28 April 2019).

Humanitarian UAV Code of Conduct (2019) https://uavcode.org (accessed 28 April 2019).

Hunt, A. and D. Specht (2019) ‘Crowdsourced mapping in crisis zones: collaboration, organisation and impact’, Journal of International Humanitarian Action, 4 (1), https://doi.org/10.1186/s41018-018-0048-1.

Jerven, M. (2013) Poor Numbers: How We Are Misled by African Development Statistics and What to Do About It (Ithaca, NY: Cornell University Press).

Karpowicz, J. (2018) ‘How did the NCDOT use drones to help plan for and recover from Hurricane Florence?’ Commercial UAV News, 12 Dec., https://www.expouav.com/news/latest/ncdot-drones-hurricane-florence (accessed 28 April 2019).

Kirkpatrick, R. (2011) ‘Data philanthropy is good for business’, Forbes, https://www.forbes.com/sites/oreillymedia/2011/09/20/data-philanthropy-is-good-for-business/#6e75c2345f70 (accessed 28 April 2019).

Kirsch, S. (2016) ‘Insular territories: US colonial science, geopolitics, and the (re)mapping of the Philippines’, Geographical Journal, 182 (1): 2–14, https://doi.org/10.1111/geoj.12072.

Kuner, C. and M. Marelli (2017) Handbook on Data Protection in Humanitarian Action (Geneva: International Committee of the Red Cross).

Kwet, M. (2019) ‘Digital colonialism is threatening the global South’, Al Jazeera, 13 March, https://www.aljazeera.com/indepth/opinion/digital-colonialism-threatening-global-south-190129140828809.html (accessed 28 April 2019).

Langan, M. (2018) Neo-Colonialism and the Poverty of ‘Development’ in Africa (Cham: Palgrave Macmillan).

Lerner, P. (2017) ‘The first drone strike – in 1944’, Air and Space, October, https://www.airspacemag.com/military-aviation/drone-strike-180964753 (accessed 28 April 2019).

Madianou, M., J.C. Ong, L. Longboan and J.S. Cornelio (2016) ‘The appearance of accountability: communication technologies and power asymmetries in humanitarian aid and disaster recovery’, Journal of Communication, 66 (6): 960–81, https://doi.org/10.1111/jcom.12258.

Manjoo, F. (2017) ‘How 2017 became a turning point for tech giants’, New York Times, 13 Dec., https://www.nytimes.com/2017/12/13/technology/tech-companies-social-responsibility.html (accessed 28 April 2019).

Mann, S. and J. Ferenbok (2013) ‘New media and the power politics of sousveillance in a surveillance-dominated world’, Surveillance & Society, 11 (1/2): 18.

Mayer-Schönberger, V. and K. Cukier (2013) Big Data: A Revolution That Will Transform How We Live, Work and Think (London: John Murray).

McDonald, L. (2013) ‘Florence Nightingale, statistics and the Crimean War’, Journal of the Royal Statistical Society: Series A (Statistics in Society), 177 (3): 569–86, https://doi.org/10.1111/rssa.12026.

McDonald, S. (2016) Ebola: A Big Data Disaster – Privacy, Property, and the Law of Disaster Experimentation (Bengaluru: Centre for Internet and Society).

McDonald, S., K. Sandvik and K. Jacobsen (2017) ‘From principle to practice: humanitarian innovation and experimentation’, Stanford Social Innovation Review (December), https://ssir.org/articles/entry/humanitarian_innovation_and_experimentation (accessed 28 April 2019).

Meier, P. (2015) Digital Humanitarians (Boca Raton, FL: CRC Press).

Milan, S. and E. Treré (2018) ‘Big data from the South(s): beyond data universalism’, Television & New Media, 20 (4): 319–35.

Mitchell, A. (2011) ‘Talk point: public and private sector partnership can improve aid and relief work’, Guardian, 19 Sept., https://www.theguardian.com/sustainable-business/blog/private-sector-humanitarian-emergencies-aid-relief (accessed 28 April 2019).

Moon, M. (2018) ‘Regulators want drones to have visible “license plates”’, Engadget, 24 May, https://www.engadget.com/2018/05/24/regulators-drones-license-plates (accessed 28 April 2019).

Mozilla (2018) ‘Resisting digital colonialism’, Internet Health Report, https://internethealthreport.org/2018/resisting-digital-colonialism (accessed 28 April 2019).

Newcome, L.R. (2004) Unmanned Aviation: A Brief History of Unmanned Aerial Vehicles (Reston, VA: American Institute of Aeronautics and Astronautics).

Olivius, E. (2015) ‘Constructing humanitarian selves and refugee others’, International Feminist Journal of Politics, 18 (2): 270–90, https://doi.org/10.1080/14616742.2015.1094245.

Paneque-Gálvez, J., N. Vargas-Ramirez, B.M. Napoletano and A. Cummings (2017) ‘Grassroots innovation using drones for indigenous mapping and monitoring’, Land, 6 (4): 86, https://doi.org/10.3390/land6040086.

Principles for Digital Development (2019) ‘Use open standards, open data, open source, and open innovation’, https://digitalprinciples.org/principle/use-open-standards-open-data-open-source-and-open-innovation (accessed 28 April 2019).

Radjawali, I. and O. Pye (2015) ‘Counter-mapping land grabs with community drones in Indonesia’, paper presented at the Land Grabbing, Conflict and Agrarian-Environmental Transformations: Perspectives from East and Southeast Asia Conference, 5–6 June, Chiang Mai University, Thailand.

Radjawali, I., O. Pye and M. Flitner (2017) ‘Recognition through reconnaissance? Using drones for counter-mapping in Indonesia’, Journal of Peasant Studies, 44 (4): 817–33, https://doi.org/10.1080/03066150.2016.1264937.

Raymond, N. and Z. Al Achkar (2016) ‘Building data responsibility into humanitarian action’, OCHA Policy and Studies Series, 18, https://docs.unocha.org/sites/dms/Documents/TB18_DataResponsibility_Online.pdf.

Read, R., B. Taithe and R. Mac Ginty (2016) ‘Data hubris? Humanitarian information systems and the mirage of technology’, Third World Quarterly, 37 (8): 1–18, https://doi.org/10.1080/01436597.2015.1136208.

Sandvik, K. (2015) ‘African drone stories’, Behemoth, 8 (2): 73–96, https://doi.org/10.6094/behemoth.2015.8.2.870.

Sandvik, K. and M. Jumbert (2016) ‘Introduction: what does it take to be good?’, in K. Sandvik and M. Jumbert (eds.), The Good Drone (Oxford: Routledge), pp. 1–25.

Sato, A. (2003) ‘The RMAX Helicopter UAV’, https://apps.dtic.mil/dtic/tr/fulltext/u2/a427393.pdf (accessed 28 April 2019).

Schroeder, A. (2018) ‘Localizing humanitarian drones: robotics and disaster response from the Maldives to Malawi’, Radiant Earth Insights, https://medium.com/radiant-earth-insights/localizing-humanitarian-drones-robotics-disaster-response-from-the-maldives-to-malawi-a1f362432cb1 (accessed 28 April 2019).

Scott, J.C. (1998) Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed (New Haven, CT: Yale University Press).

Seybolt, T. (2009) ‘Harmonizing the humanitarian aid network: adaptive change in a complex system’, International Studies Quarterly, 53: 1027–50.

Sheets, K.D. (2018) ‘The Japanese impact on global drone policy and law: why a laggard United States and other nations should look to Japan in the context of drone usage’, Indiana Journal of Global Legal Studies, 25 (1): 513–37.

Shinal, J. (2017) ‘Facebook has mapped populations in 23 countries as it explores satellites to expand internet’, CNBC, https://www.cnbc.com/2017/09/01/facebook-has-mapped-human-population-building-internet-in-space.html (accessed 28 April 2019).

Skylogic Research (2019) 2018 Drone Market Sector Report, 14 Jan., http://droneanalyst.com/research/research-studies/2018-drone-market-sector-report-purchase (accessed 28 April 2019).

Sletto, B. (2011) ‘Indigenous rights, insurgent cartographies, and the promise of participatory mapping’, Lilas Portal, 7: 12–15, https://repositories.lib.utexas.edu/handle/2152/62703.

Soesilo, D. and K. Sandvik (2016) ‘Drones in humanitarian action: a survey on perceptions and applications’, http://drones.fsd.ch (accessed 28 April 2019).

Specht, D. and A. Feigenbaum (2018) ‘From cartographic gaze to contestatory cartographies’, in P. Bargués, D. Chandler and E. Simon (eds.), Mapping and Politics in the Digital Age (London: Routledge), pp. 55–71.

Stamp, J. (2013) ‘Unmanned drones have been around since World War I’, Smithsonian, 12 Feb., https://www.smithsonianmag.com/arts-culture/unmanned-drones-have-been-around-since-world-war-i-16055939 (accessed 28 April 2019).

Tang, D., A. Agarwal, D. O’Brien and M. Meyer (2010) ‘Overlapping experiment infrastructure: more, better, faster experimentation’, in Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (New York: ACM), pp. 17–26.

Taylor, L. and D. Broeders (2015) ‘In the name of development: power, profit and the datafication of the global South’, Geoforum, 64: 229–37, https://doi.org/10.1016/j.geoforum.2015.07.002.

Taylor, L., L. Floridi and B. van der Sloot (eds.) (2017) Group Privacy: New Challenges of Data Technologies (New York: Springer International).

Taylor, L. and R. Schroeder (2015) ‘Is bigger better? The emergence of big data as a tool for international development policy’, GeoJournal, 80 (4): 503–18, https://doi.org/10.1007/s10708-014-9603-5.

Tenner, E. (1996) Why Things Bite Back: Technology and the Revenge of Unintended Consequences (New York: Knopf).

Thatcher, J., D. O’Sullivan and D. Mahmoudi (2016) ‘Data colonialism through accumulation by dispossession: new metaphors for daily data’, Environment and Planning D: Society and Space, 34 (6): 1–17, https://doi.org/10.1177/0263775816633195.

Tingitana, L. and J. Kaiser (2018) ‘Training a new generation of African drone pilots’, WeRobotics, https://blog.werobotics.org/2018/04/05/training-new-drone-pilots-in-tanzania (accessed 28 April 2019).

Udupa, S. (2015) Making News in Global India: Media, Publics, Politics (Cambridge: Cambridge University Press).

United Nations International Children’s Emergency Fund (UNICEF) (2017) ‘Africa’s first humanitarian drone testing corridor launched in Malawi by Government and UNICEF’, UNICEF Stories of Innovation, 29 June, http://unicefstories.org/2017/06/29/africas-first-humanitarian-drone-testing-corridor-launched-in-malawi-by-government-and-unicef (accessed 28 April 2019).

— (2018) ‘UNICEF in collaboration with the government of the Republic of Kazakhstan establishes two drone testing corridors in Kazakhstan’, 22 Oct., http://unicefstories.org/2018/10/22/dronecorridorkazakhstan (accessed 28 April 2019).

Vardi, M.Y. (2018) ‘Move fast and break things’, Communications of the ACM, 61 (9): 7, https://doi.org/10.1145/3244026.

Wallace-Wells, B. (2014) ‘Drones and everything after: the flying, spying, killing machines that are turning humans into superheroes’, New York Magazine, 5 Oct., https://nymag.com/intelligencer/2014/10/drones-the-next-smartphone.html (accessed 14 March 2020).

World Bank (2018) ‘United Nations, World Bank, and humanitarian organizations launch innovative partnership to end famine’, 23 Sept., https://www.worldbank.org/en/news/press-release/2018/09/23/united-nations-world-bank-humanitarian-organizations-launch-innovative-partnership-to-end-famine (accessed 28 April 2019).

World Economic Forum (2018) ‘3 ways big data is changing the humanitarian sector’, 16 Jan., https://www.weforum.org/agenda/2018/01/3-ways-big-data-is-changing-the-humanitarian-sector (accessed 28 April 2019).

World Food Programme (WFP) Innovation Accelerator (2019) ‘UAVs for cargo delivery’, 1 Jan., https://innovation.wfp.org/project/uavs-cargo-delivery (accessed 28 April 2019).

World Food Programme (WFP) Insight (2019a) ‘Above the clouds: how drones can support early warning systems in Bolivia’, 17 Jan., https://insight.wfp.org/above-the-clouds-37818e954bfb (accessed 28 April 2019).

World Food Programme (WFP) Insight (2019b) ‘Drones to the rescue as Cyclone Desmond storms Mozambique’, 24 Jan., https://insight.wfp.org/drones-to-the-rescue-as-cyclone-desmond-storms-mozambique-d7f501e40b0f (accessed 28 April 2019).

Xu, F., Z. Tu, Y. Li, P. Zhang, X. Fu and D. Jin (2017) ‘Trajectory recovery from ash: user privacy is not preserved in aggregated mobility data’, in Proceedings of the 26th International Conference on World Wide Web (New York: ACM), pp. 1241–50.

Yarrish, G. (2011) ‘The good old days: the birth of RC’, Model Airplane News, https://www.modelairplanenews.com/the-first-days-of-rc (accessed 28 April 2019).

Zipline (2019) ‘Our impact’, https://flyzipline.com/impact (accessed 28 April 2019).

Zuboff, S. (2015) ‘Big other: surveillance capitalism and the prospects of an information civilization’, Journal of Information Technology, 30 (1): 75–89, https://doi.org/10.1057/jit.2015.5.

— (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs).
