header-includes:
- \lhead{\thepage}
- \chead{Permacomputing for Wilderland}
- \rhead{Literature Review}
- \lfoot{2025}
- \cfoot{}
- \rfoot{}
- \renewcommand{\headrulewidth}{0.4pt}
pandoc --metadata --link-citations=true --citeproc "Literature-review.md" --bib
## List of abbreviations:
* IR: Information Retrieval
* LLM: Large Language Models
* AI: Artificial Intelligence
* UbiComp: Ubiquitous Computing
* ML: Machine Learning
## Introduction
A main focus of the RQs is permacomputing, which is defined as a "potential field of convergence between technology, cultural work, environmental research and activism" [@mansouxPermacomputingAestheticsPotential2023a]. Permacomputing is also considered to be a nascent concept that will be developed through a community of artists, activists, researchers, ecologists, alternative computing folk and practitioners. Little literature exists specifically on permacomputing; however, communities such as the artists and academics of the "Computing within Limits" conference have been developing the idea that computation needs to embrace limitations since 2015. This dissertation seeks to apply permacomputing to education contexts and their struggles, therefore three literary areas have been selected to position the work against the landscape of existing research. They are: Decolonial Computing; Digital activism; and Critical edTech. The common ground between these three is broad (digital matter, critical stances, a focus on social justice) but their origins are clearly different. Decolonial computing is the most direct application to the field of computing itself: it is interested in what digital infrastructure is, where it is, where it comes from, and who controls it. The underlying connection between colonial attitudes and resource exploitation is the primary vector that I use to develop this conversation, inscribing permacomputing as a decolonial computing project, and vice versa. The second literary area, Digital activism, works from within institutional constraints but seeks to highlight contradictions through a "paranodal" [@blasContraInternetJournal742016] rethinking of computing and networking, one that questions centrality and structure. Lastly, Critical edTech is the critical study of Education Technology. It is interested in how digital literacy is mediated, questions the inherent use of technology in education, and contrasts educational benefits with technology vendors' motivations.
The selection process of literature for this review aims to form a contextual framework, as permacomputing is a novel notion. The contextual framework uses the three literary areas to inform the research questions across disciplines, position the dissertation and guide the research process. In practice, the selection of literature involved using the search terms 'permacomputing', 'decolonial computing', 'digital activism' and 'critical edTech', and stylistic variations of these terms, on academic search platforms. Closed-access, subscription-based tools JSTOR and IEEE (Institute of Electrical and Electronics Engineers) returned sparse results. Hybrid-access tools Google Scholar and OneSearch, in combination with direct searching of industry-specific publications, in particular those of the Association for Computing Machinery, provided most of the literature. Intentional exclusions were made on the basis that a text did not contribute to the contextual framework or inform the research questions. Unintentional exclusions were likely made due to Anglocentric linguistic bias, as well as the fairness, selection, representation, algorithmic, data and user biases in academic research identified by Gao et al. [-@gaoAddressingBiasFairness2021] and Das et al. [-@dasDecolonialPostcolonialComputing2022]. The shortcomings in Information Retrieval (IR) systems noted above are particularly relevant here, as authors and researchers who are concerned with permacomputing, and with the other areas of this review, are also likely to be resisting large-scale indexation (by web crawlers) for contemporary Large Language Models (LLMs) and overhyped Artificial Intelligence (AI) tools. Put otherwise, it is likely that researchers are contributing to permacomputing literature, but they may not be found via (this) academic literature review process, which is another reason a contextual framework is required here.
<!--
A note on why large books like the list below cannot be included?
-->
The focus of this first section of the literature review is the critical conversation that describes the colonial attitudes of computing, and the subsequent proposals and efforts to decolonise it. In the next paragraphs I use selected literature to highlight the elements of computing that are, or show traces of, colonial "conquest patterns"; colonial attitudes towards resources and how they are obtained (such as how data is obtained, or harvested); and how this resource thinking extends to people and their exploitation, all in an effort to situate permacomputing within the debate on decolonial computing and to situate this study within that broader context. Some literature addresses Decolonial Computing directly, although the idea is not widespread. What follows is an explanation of the term and its origins, to begin building the contextual framework.
Starting from a distant view, one analysis of existing literature on Decolonial and Postcolonial Computing Research [@dasDecolonialPostcolonialComputing2022] uses a scientometric, systematic literature review to consider 115 papers’ metadata and infer research trends and popular publication venues in these areas. It reveals a community of researchers bound by their adoption of decolonial computing and education as a theoretical framework. The research within that community mostly concerns community-focused topics and digital literacy. A more theoretical starting point within this literature is the work of Syed Mustafa Ali in 2014-2016. Ali distinguishes his work from, yet acknowledges the contributions of, Postcolonial Computing (which engages with the broader critiques of postcolonial thinking) and pushes towards 'decolonial thinking', which "re-conceptualises analysis of the world system from the (Southern/Non-European) margins/periphery, rather than the (Northern/European) core" [@aliBriefIntroductionDecolonial2016]. Ali embraces 'decolonial turns' and proposes "decolonial computing [...] as a response to computing's 'colonial impulse'" [@aliDecolonialComputing2014].
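To make the scientometric step concrete, the following is a minimal sketch of the kind of metadata tallying such a review performs. The records, field names and counts are invented for illustration; this is not Das et al.'s data or pipeline.

```python
# Toy sketch of a scientometric tally: counting publication venues and years
# from a small set of hypothetical paper metadata records, the sort of pass
# a systematic review might run over 115 papers' metadata.
from collections import Counter

papers = [  # invented records standing in for bibliographic metadata
    {"title": "Decolonial HCI in community networks", "venue": "CHI", "year": 2020},
    {"title": "Postcolonial computing revisited", "venue": "CSCW", "year": 2019},
    {"title": "Digital literacy at the margins", "venue": "CHI", "year": 2021},
]

venue_counts = Counter(p["venue"] for p in papers)
year_counts = Counter(p["year"] for p in papers)

print("Most common venues:", venue_counts.most_common(2))
print("Papers per year:", sorted(year_counts.items()))
```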
This essential contribution, that *computing [has] colonial impulse[s]*, is made by Paul Dourish and Scott Mainwaring [-@dourishUbicompColonialImpulse2012]. Originating in the study of UbiComp (Ubiquitous Computing), and using a combination of literature review, historical analysis, critical reflection and interpretative methodology, they argue that the connections between computing and coloniality are rooted in the basics of what computing is and how it is intended to work: computing, at its core, seeks to standardise or universalise systems of communication and knowledge. This reflects ideas of modernist state apparatuses that "regulate and operate through standardization and the imposition of homogenization" (ibid.). These practices, in modern computing, are often justified through the benefits of universal access and usefulness, but often produce epistemic injustice. Dourish and Mainwaring take, for example, Google's mission statement to "organize the world's information and make it universally accessible and useful". They point out that the terms 'world', 'universal' and 'useful' have "significant strategic power" (ibid.). Despite this mission statement, it is essential to remember that Google is a multinational corporation whose main income originates from advertising, not philanthropic information access. This 'colonial impulse' is the broader observation that "the colonial project has an afterlife in the post-colonial era in terms of the structuring logics of coloniality, of social ordering, systems of knowledge production and dissemination [and] ways of ordering the live world of interaction." [@aliDecolonisingComputing2020].
Ali [-@aliBriefIntroductionDecolonial2016] takes the above and uses critical theory, decolonial thought, case study analysis and interdisciplinary approaches to propose Decolonial Computing. This is a critical project that interrogates "who is doing computing, where they are doing it, and, thereby, what computing means both epistemologically and ontologically". Further, decolonial computing observes the "phenomenon of computing from a perspective informed by (even if not situated at) the margins or periphery of the modern world system wherein issues of body politics and geo-politics are analytically foregrounded" [@aliBriefIntroductionDecolonial2016]. With computing's colonial impulses established, and the critical project of Decolonial Computing in mind, the next sections consider other research that takes these starting points and contributes to this specific contextual framework.
Michael Kwet, another prominent voice in the field of decolonial computing, argues that big technology corporations (Big Tech) are 'reinventing colonialism' through their control and domination of digital technology. Using combinations of critical, historical and comparative analysis, with an interdisciplinary approach, Kwet ([-@kwetDigitalColonialismSouth2019], [-@kwetDigitalColonialismUS2019], [-@kwetDigitalTechDeal2022], [-@kwetDigitalEcosocialismBreaking2022], [-@kwetMichaelKwetBig2023]) states that instead of the traditional colonial quest for land, a 'handful of US multinational corporations are colonising digital technology' [-@kwetDigitalColonialismUS2019]. Kwet compares imperial conquests, which involve the dispossession of valuable resources from native peoples and the ownership and control of infrastructure, with digital infrastructure. Digital forms of power are broken down into core pillars: software, hardware, network connectivity, intellectual property, the means of retribution and the derived data. Kwet shows that all of these forms are dominated by a few corporations and some key mechanisms. Issues of digital colonialism extend to the ways in which it protects its means of production through intellectual property mechanisms, which is considered later.
It is worth noting that mainstream critiques of Big Tech are beginning to acknowledge epistemic injustice. Until now, many critiques spoke primarily of issues of bias, but this conversation is evolving. A journalist writing about the changing landscape of online search and IR makes Kwet's arguments more specific. In an article that comments on the 'generative AI race', Linehan says that for decades what the Big Tech companies did was to interpose themselves between the user and the digital experience (see 'assetisation' in part 3 of this review) in order to make money from it [@pollakWhatGooglepocalypseHow]. With the advent of generative AI and large language models (intensive digital processes that involve the indiscriminate ingestion of usually copyrighted material to generate 'authorless' responses), there is an opportunity for these companies to not only be "mere gatekeepers of the world's information, but its tollbooth operator as well" [@linehanGooglepocalypseSweepingUnited2024]. This journalistic example is included to provide a non-academic perspective on computing's colonial attitudes, which here echo Kwet's corporate colonisation of digital technology and Zuboff's "conquest patterns".
Contributing to the debate on Decolonial Computing are the works co-authored by Ulises Ali Mejias and Nick Couldry. Two books are keystones of the conversation on Digital Colonialism; they employ perspectives from sociology, communication studies and political analysis via empirical data, case studies, conceptual analysis and critical theory. Firstly, 'The Cost of Connection' [-@couldryCostsConnectionHow2019] puts forward the notion of data colonialism, its implications, and how it reinforces global power imbalances. Data colonialism is the pervasive process by which human life is increasingly subject to the extraction of data, often without consent, where the digital information resulting from the capture of interactions between humans is as valuable as which sites they visit or which search queries they perform. (Consider WhatsApp as an example of this: it is commonly understood that messaging apps like WhatsApp are more interested in metadata than in message content, as the content itself is encrypted and cannot be read by anybody but the sender or receiver. WhatsApp derives profit from who is being messaged, when, how frequently, and in which groups, capturing interaction itself as saleable data.) Slightly outside the scope of this review of Decolonial Computing, but informing the contextual framework, this critical understanding of data and the mechanisms by which it is obtained is described at length in the popular book by Shoshana Zuboff, 'The Age of Surveillance Capitalism' [-@zuboffAgeSurveillanceCapitalism2019]. Zuboff's book also provides applications of "conquest patterns" in Big Tech, making clear the connections between Big Tech, capitalism and digital colonialism. In their second book, 'Data Grab: The New Colonialism of Big Tech and How to Fight Back' [-@mejiasDataGrabNew2024], the authors detail how modern data practices represent a new form of colonialism. If historical colonialism involved the extraction of physical and labour resources from communities and lands, this new digital colonialism extracts unconditionally from individuals, sometimes aimlessly, as a raw material that converts into future corporate profits. The first book focuses principally on digital data, critiquing the colonial mindset behind the late-2010s refrain of "Data is the new oil" (see [@theeconomistWorldMostValuable2027], [@giacagliaDataNewOil2019], [@sadowskiWhenDataCapital2019]), whereas the second book places the responsibility for this 'Data Grab' with the Big Tech companies that profit from it. These notions provided by Mejias and Couldry are not only partial responses to Ali's Decolonial Computing critical project, which asks who does computing and how it is done today, but they also provide active methodologies for resisting or undoing the imbalances created by these outlooks. We shall return to these methodologies in the 'Digital Activism' section of this review.
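As an aside on the WhatsApp example, a minimal sketch of the distinction it rests on is given below: the message body is opaque ciphertext to the platform, while the surrounding interaction metadata remains structured and analysable. All field names and values are invented for illustration and do not describe WhatsApp's actual data model.

```python
# Illustrative sketch only: one hypothetical end-to-end encrypted message event.
# The body is unreadable ciphertext, but the metadata around it (who, when,
# which group) is exactly the kind of interaction data that can be aggregated
# and monetised, as Couldry and Mejias describe.
from datetime import datetime, timezone

message_event = {
    "sender_id": "user_482",              # who is messaging
    "recipient_group": "group_park_run",  # which group they are in
    "timestamp": datetime(2024, 5, 3, 8, 17, tzinfo=timezone.utc),
    "ciphertext": b"\x9f\x03\xa1...",     # content: opaque to the platform
}

# The platform cannot read the ciphertext, but it can still build relational
# patterns from the metadata: contact graphs, frequency, timing.
profile_signal = {
    "pair": (message_event["sender_id"], message_event["recipient_group"]),
    "hour_of_day": message_event["timestamp"].hour,
}
print(profile_signal)
```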
Another key contribution to the study of Decolonial Computing is made by Abeba Birhane and Olivia Guest in their paper 'Towards Decolonising Computational Sciences' [-@birhaneDecolonisingComputationalSciences2021]. They intersectionally analyse case studies to highlight and discuss the need to dismantle the colonial and oppressive structures embedded within the computational sciences. They argue that current systems in the field perpetuate, among other harms, sexism and scientific racism. They also demonstrate that conclusions in many scientific fields are based on Western data sets that skew outputs and reinforce supremacy. Next, they reject tokenistic efforts such as diversity boards in favour of a deep re-examination of peer-reviewed publications and publication systems. Finally, the authors conclude that it is "the broader academic ecosystem that needs to be [...] reimagined in a manner that explicitly excludes harmful pseudoscience and subtly repackaged white supremacism from its model" [@birhaneDecolonisingComputationalSciences2021] to ensure that marginalised voices are heard. This article provides demonstrations of the issues and offers practical outcomes, but it also demonstrates how the colonial attitudes of digital technology have permeated academic and scientific spaces. Abeba Birhane's work propels the field of Decolonial Computing forwards, with other writings describing 'Algorithmic Colonisation', whereby historical injustices and power imbalances are built into the inner workings of computational systems such as algorithms, which may decide what a person is shown or not shown in, for example, search results or social media feeds. Another Birhane contribution describes Relational Ethics [-@birhaneAlgorithmicInjusticeRelational2021a], which challenges dominant and individualistic approaches to ethics in AI development and the computer sciences by highlighting the importance of relationships (interconnectedness) and context (financial, geographical, political) when new work is published. Birhane's work also explains that non-Western regions, where computation is not as intense or present, can become targets for algorithmic colonisation. In a paper that focuses on the Algorithmic Colonisation of Africa [-@birhaneAlgorithmicColonizationAfrica2020], it is shown that, especially in the AI industry, Western tech monopolies impose their ways of doing things without any consideration for local differences or indeed needs. Here, Birhane directly cites Zuboff's "conquest patterns", connecting Big Tech, computing, capitalism and colonialism once again.
Returning to later thinking from Ali [@aliDecolonisingComputing2020], a proposal is made to shift away from the language of bias. Typically, in the techno-critical discourse developed over the last two decades, problematic technological systems[^biasproblems] are classed as issues of bias, whereby training data, used for example for machine learning (ML) or AI, is fed into a system to construct a synthetic understanding of the inputs and later analyse other data. The argument is that if this training data itself is biased (or contains misrepresentations, or incomplete representations), then this pattern will percolate through the system and produce biased results. This bias perspective, highlighted by researchers such as Kate Crawford [-@crawfordOpinionArtificialIntelligence2016], Safiya Noble [-@nobleAlgorithmsOppressionHow2018], Cathy O'Neil [-@oneilWeaponsMathDestruction2017], Virginia Eubanks [-@eubanksAutomatingInequalityHow2019] and Ruha Benjamin [-@benjaminRaceTechnologyAbolitionist2019], shows how AI systems (and computing more broadly) replicate and amplify existing social inequalities. Ali takes the view that focusing on bias foregrounds the technicality of datasets and abstract mathematics, whereas computation is inseparable from human beings and is fundamentally sociotechnical in nature. All aspects of computing, from technology deployment and design to the underlying mathematics, are fundamentally human activities. Focusing solely on the technical aspect neglects the essential social and human components, which should not be treated as an afterthought but as integral to the nature of computing. Ali argues that merely diversifying data sets without addressing deployment contexts and systemic power imbalances can lead to superficial fixes that do not resolve deeper structural issues. Powles and Nissenbaum [-@powlesSeducVeDiversion2018] support this by arguing that focusing solely on AI bias serves as a diversion from more significant concerns about the broader societal impact of AI technologies. They contend that addressing bias within the existing framework of automation fails to tackle the root social causes of discrimination and may inadvertently contribute to systems of surveillance and classification. Their critique stresses that the current emphasis on "fixing" AI systems, instead of questioning their very necessity or the ethical implications of their deployment, leads to a resignation to technology's encroaching dominance. Ultimately, they call for a radical rethinking of data control, robust accountability mechanisms, and the possibility of halting harmful AI systems to prioritise public interest and societal well-being. Put otherwise, using the language of bias can be a helpful way of socially explaining the harms resulting from computing, but it is essential to halt solutionist thinking aimed at 'fixing' the bias issues and move to consider whether these systems are needed at all. This forms a basis for an activist response of refusal, which will be further discussed in the second section of this review.
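To illustrate the percolation argument in the simplest possible terms, the sketch below trains a naive rate-based 'model' on an invented, skewed historical dataset and then scores new cases with it; the skew in the data reappears directly in the outputs. This is a toy illustration, not a reproduction of any cited study or real ML pipeline.

```python
# Minimal sketch (invented data): a naive model that "learns" approval rates
# per group from skewed historical decisions and reproduces that skew when
# scoring new cases, showing how bias in training data percolates into outputs.
from collections import defaultdict

# Historical decisions: group A was approved far more often than group B.
training_data = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

# "Training": estimate an approval rate per group.
totals, approvals = defaultdict(int), defaultdict(int)
for group, outcome in training_data:
    totals[group] += 1
    approvals[group] += outcome
rate = {g: approvals[g] / totals[g] for g in totals}

# "Inference": new applicants are scored with the learned rates, so the
# historical imbalance is reproduced rather than questioned.
for applicant_group in ["A", "B"]:
    print(applicant_group, "predicted approval probability:", rate[applicant_group])
```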
Finally, to fully integrate decolonial computing into the context of permacomputing, it is crucial to address the ecological concerns associated with the production and use of computing electronics. Decolonial computing seeks to examine who engages in computing, where it takes place, and its broader implications. Consequently, the environmental damage caused by computing [@katwalaSpirallingEnvironmentalCost2018] should be recognised as another form of colonial resource exploitation [@arboledaPlanetaryMineTerritories2020]. The literature has highlighted how computing exhibits colonial tendencies, follows patterns of conquest, occupies digital infrastructures, exploits data indiscriminately, and perpetuates epistemic injustice and power imbalances. These behaviours also extend to the physical demands of computing hardware. The extractive mining of vital minerals like cobalt, nickel and lithium, predominantly sourced from Global South countries such as the Democratic Republic of Congo, Chile, Argentina and China, is environmentally destructive and reinforces the imperialistic nature of computing. Crary's book 'Scorched Earth' [-@craryScorchedEarthDigital2022] is another important contribution in this space. Using cultural critique, philosophical enquiry and a critical theory approach, Crary emphasises that it is the relentless expansion of digital processes that has led to widespread environmental destruction, which is particularly troubling in the ever more power- and hardware-hungry age of ML and LLMs. Kwet [-@kwetDigitalEcosocialismBreaking2022] has written extensively on the connection between colonial attitudes in digital spaces and planetary resource exhaustion, and makes proposals towards Digital Ecosocialism, arguing that "Environmental justice and digital justice go hand-in-hand, and the two movements must link up to achieve their goals."
## Digital activism
The literary area of Digital Activism, which observes when and how digital media is used for social and political change, is vast. Currently, much research in this field concerns digital activism through social media platforms and the analysis of digital communication and community practices. The majority of the analysis of digital activism focuses on 'hashtag activism' (see @jacksonHashtagActivismNetworksRace2020, @xiePraxisHashtagActivism2023, @estrella-ramonHashtagActivismTwitter2024); these studies tend to have techno-optimist perspectives and often focus on single platforms such as Twitter or Instagram. In building a permacomputing contextual framework, literature on the use of mainstream social media for activism has been rejected. The initial reason for this exclusion is that permacomputing intends to be a resource-aware, human-scaled form of computation, which is not the case for centralised social media, which is terminally power- and resource-hungry. A more fundamental reason for the rejection of this type of activism in this literature review is the undeniable contradiction that arises from the use of a hegemonic social media system for activist, liberatory, social justice or counter-hegemonic causes. Although 'hashtag activism' studies are being excluded from this review, debates about its effectiveness are included. A well-articulated argument is put forward by Aouragh et al. [-@aouraghFCJ196LetFirst2015a], whose paper highlights the paradox of "counter-hegemonic activist groups becoming complicit with the labour conditions and logic of profit inherent to commercial social platforms". Their text uses critical analysis to detail how digital tools used by activist and education organisations require greater scrutiny, and they propose digital departure and digital refusal as technological means of resistance to centralised powers. This contradiction of activism happening within closed and commercial systems is also the starting point of @aleefrostPoisonedChaliceHashtag2020, who critically reviews the book "Hashtag Activism" [-@jacksonHashtagActivismNetworksRace2020]. Their critique is that the multiple stories of hashtag campaign activism told in the book only "measure the influence of Twitter activism according to the metrics of Twitter itself: the saturation and permeation of slogans, platitudes and their attendant discourse online." A'Lee Frost goes on to highlight that even the most popular hashtag activism campaigns have had few victories or lasting legacies of power. This critique is articulated around the work of Zeynep Tufekci, a consistent sceptic of "Social Media's *transformative* potential". Lightly cited in the book but clarified in the review, Tufekci's views (articulated in 'Twitter and Tear Gas: The Power and Fragility of Networked Protest' [-@tufekciTwitterTearGas2017]) are "that government and capital have more power than the masses over social media", and that "the digital economy of private companies running on ad revenue does not and cannot function as a tool of the people" [@aleefrostPoisonedChaliceHashtag2020]. Others support Tufekci's views, and widen the unsuitedness of mainstream social media platforms for digital activism. @deanCommunicativeCapitalismCirculation2005 argues that in such systems of "Communicative Capitalism" engagement and activism on social media absorb dissent and replace political action.
@bennettLogicConnectiveAction2013 deepen this understanding by contrasting social media's "connective action" with traditional activism's "collective action", highlighting how "connective action" may allow for mass participation and develops rapidly, but can be fragmented and short-lived, with few examples of the achievement of concrete political goals. The debate on the issue of hashtag activism, also known as "slacktivism" (see @christensenPoliticalActivitiesInternet2011, [@morozovNetDelusionDark2011a, p. 189]), acknowledges that social media campaigns can raise awareness of issues, but that few forms of real engagement result. @bonillaFergusonDigitalProtest2015 propose "hashtag ethnography" as a new way of analysing digital social movements, and find that hashtag activism plays more of a role of public storytelling than of direct action. Bonilla and Rosa recognise the issues of corporate co-optation within their study, and conclude that hashtags have the potential to increase awareness but rarely lead to policy change or structural reform. Mechanisms of 'likes' and 'reposts' give an illusion of activism as they can gain traction and speed rapidly, but "in fact, the accelerated development of online activism belies a deadly progeria: it burns hotly, brightly, and briefly, often with nothing to show in the end but a glut of forgettable, disposable content and the emotional exhaustion of participants" (ibid.). Hashtag activism must be understood as performative, and as a kind of participation "that is inclined toward a horizontal, identity-based movement-building" [@tufekciTwitterTearGas2017]. In a book on the failures of hashtag activism, @schradieRevolutionThatWasnt2019 describes the "digital activism gap", where supposedly liberatory technologies have "ended up reproducing, and even intensifying, the preexisting power imbalances that permeate today's society" [@trifiroRevolutionThatWasnt2021]. The book's subtitle, "how digital activism favors conservatives", is not claiming that the Internet is an inherently conservative endeavour; instead, "she is arguing that the political and communication goals conservatives pursue online are easier to achieve [within communication capitalism / platform capitalism systems such as commercial social media] than those pursued by the radical or reformist left" [@karpfBookReviewRevolution2020].
The type of digital activism sought in this section, then, is one that is aware of the context of the tools and methods of its activism; one that is aware of the digital means of production [@aleefrostPoisonedChaliceHashtag2020]. Therefore, this next section of the review specifically seeks to inform the 'oppositional knowledge' and alternative media elements of the RQs. Alternative media, defined as "media outlets that challenge dominant political, economic and media power" [@leeInternetAlternativeMedia2015], is thought of as a way of giving access to oppositional knowledge. Oppositional knowledge is knowledge that is "instrumental in the formation of critical attitudes towards dominant power and [leads to] actual participation in oppositional action" (ibid.). These are considered more radical forms of activism, away from platform capitalism [@srnicekPlatformCapitalism2019] and Big Tech. The underlying mechanisms required for oppositional knowledge to take hold are antagonistic responses and the form of refusal outlined by Aouragh et al. The context of permacomputing and activism forwards the idea that refusal is, in itself, a form of activism, and one that this research project will need to engage with as it answers its RQs. Antagonistic responses are also considered, as they can be a stepping stone to oppositional knowledge, and potentially to refusal.
Kaun and Treré [-@kaunRepressionResistanceLifestyle2020a] provide a solid foundation for this type of activism, which they call "push-back activism", with their typology of digital disconnection. They bring together literature under a disconnection studies heading, and utilise "insights from social movements studies to discuss the political potential and consequences of digital disconnection and activism". They develop a three-dimensional matrix of political practices that includes "power, level of collectivity and temporality", which leads to three types of motivations for disconnection. These are:
| Type of disconnection | Dimensions (power, level of collectivity, temporality) | Description |
| --- | --- | --- |
| Digital disconnection as resistance / Resistance-motivated | a) from below, fast pace, individual (or small networks) b) from below, fast pace, individual c) from below, relatively slow, collective | Here, disconnection is a deliberate act of resistance against the pervasive influence of digital capitalism. By disconnecting, individuals or groups reject the commodification of their data and the exploitation of their attention by tech corporations. This type of disconnection is a form of protest against the economic and social structures that digital technologies support. |
| Digital disconnection as lifestyle politics / Lifestyle-motivated | a) from below, fast pace, highly individualized b) from above, slow, individualized | This type focuses on the personal and psychological reasons for disconnecting, often as a way to reclaim time, attention, and well-being from the constant demands of digital life. Individuals may disconnect to resist the addictive nature of social media, reduce stress, or foster deeper, more meaningful offline relationships. While this type of disconnection is personal, it also carries political implications by challenging the norms of digital engagement and consumerism. |
Considering this review's focus on permacomputing, which has multiple motivations, but is lead by ecological concerns, this review continues with a focus on resistance-motivated disconnection. Disconnections or refusals can happen on an individual or a collective basis. Notable individual resistance-motivated disconnection techniques involve data obfuscation projects, such as wearable 'smart assistant' microphone jamming jewlery [@chenWearableMicrophoneJamming2020] or artist image production protection tool Glaze, which introduces "barely perceptible perturbations to images" to poison Generative AI models that copy and mimic without accreditation or licence [@shanGlazeProtectingArtists2023]. Collective forms of resistance-motivated disconnection involve more extensive non-participation and the fostering of community-managed spaces, such as Mastodon, a federated alternative medias, or Free Libre Open Source Software (FLOSS) communities. Further work by Heemsbergen et al. [-@heemsbergenIntroductionAlgorithmicAntagonisms2022] describe more practical ways in which these resistances are enacted. They introduce "algorithmic antagonisms", which are "media practices that tactically subvert, manipulate, or obviate extant power relations". With the antagonistic components stemming from the work of Chantal Mouffe [-@mouffeReturnPolitical1997], this type of activism has a direct relation to Tactical Media, an umbrella term for digital forms of activism such as algorithmic antagonisms. Algorithmic antagonisms are characterised firstly by their disruptive aesthetics, whereby they may attempt to make invisible structures visible. Secondly, by amplification, where they may amplify computational ways to "navigate, cope with, and compete in a computationally saturated social world" [@heemsbergenIntroductionAlgorithmicAntagonisms2022]. Thirdly, by queering data, and challenging normative structures of data and algorithms (see the Zach Blas example below).
Considering this review's focus on permacomputing, which has multiple motivations but is led by ecological concerns, it continues with a focus on resistance-motivated disconnection. Disconnections or refusals can happen on an individual or a collective basis. Notable individual resistance-motivated disconnection techniques involve data obfuscation projects, such as wearable 'smart assistant' microphone-jamming jewelry [@chenWearableMicrophoneJamming2020] or the artists' image protection tool Glaze, which introduces "barely perceptible perturbations to images" to poison Generative AI models that copy and mimic without accreditation or license [@shanGlazeProtectingArtists2023]. Collective forms of resistance-motivated disconnection involve more extensive non-participation and the fostering of community-managed spaces, such as Mastodon, a federated alternative media network, or Free Libre Open Source Software (FLOSS) communities. Further work by Heemsbergen et al. [-@heemsbergenIntroductionAlgorithmicAntagonisms2022] describes more practical ways in which these resistances are enacted. They introduce "algorithmic antagonisms", which are "media practices that tactically subvert, manipulate, or obviate extant power relations". With the antagonistic components stemming from the work of Chantal Mouffe [-@mouffeReturnPolitical1997], this type of activism has a direct relation to Tactical Media, an umbrella term for digital forms of activism such as algorithmic antagonisms. Algorithmic antagonisms are characterised firstly by their disruptive aesthetics, whereby they may attempt to make invisible structures visible. Secondly, by amplification, where they may amplify computational ways to "navigate, cope with, and compete in a computationally saturated social world" [@heemsbergenIntroductionAlgorithmicAntagonisms2022]. Thirdly, by queering data, and challenging normative structures of data and algorithms (see the Zach Blas example below).
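To make the perturbation tactic concrete, the sketch below is a minimal, hypothetical illustration of adding a small, bounded change to an image so that it stays visually identical while its pixel data no longer matches what a scraper originally captured. It is not Glaze's actual algorithm (Glaze optimises its perturbations against the feature space of specific generative models rather than using random noise); the function name, noise budget and array sizes are all invented for the example.

```python
import numpy as np

def perturb_image(pixels: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Return a copy of `pixels` with a small, bounded random perturbation.

    A toy stand-in for 'barely perceptible perturbations': real cloaking tools
    such as Glaze optimise the perturbation against the models they target
    rather than adding noise. `epsilon` bounds the per-pixel change (0-255 scale).
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    cloaked = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return cloaked.astype(pixels.dtype)

# Hypothetical 64x64 RGB image; the per-pixel change stays within the budget.
image = np.random.default_rng(1).integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
cloaked = perturb_image(image, epsilon=2.0)
print(np.abs(cloaked.astype(int) - image.astype(int)).max())  # at most 2
```

The point of the toy example is only the asymmetry it gestures at: a change budget small enough to be imperceptible to a human viewer can still matter to the machine-learning pipeline that ingests the file.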
A key piece of literature that speaks to digital activism, within tactical media and antagonistic resistances, is the book "Algorithms of Resistance: The Everyday Fight against Platform Power" by Tiziano Bonini and Emiliano Treré [-@boniniAlgorithmsResistanceEveryday2024]. Based on long-term case studies, they build a list of forms of resistance to, and contestations of, the pervasive influence of algorithms in daily life. The book shows how people resist platform-driven exploitation, looking at people who need to use platforms "like Airbnb or Deliveroo to work; those working in the culture industry relying on platforms; and those trying to generate political activity through platforms" [@oloughlinBookReviewAlgorithms2025]. The types of resistance they find include: obfuscation (hiding or distorting personal data to disrupt surveillance capitalism), 'Gaming the System' (exploiting algorithmic loopholes for economic or social advantage), 'Alternative Platforms' (developing cooperative, user-driven platforms as alternatives to corporate tech), and amplification & sabotage (intentionally boosting or disrupting certain content flows to influence algorithmic outputs). Beyond these much more interesting forms of resistance, when compared to hashtag activism, the authors propose the notion of Algorithmic Agency to describe a number of "algorithmic gaming practices" [@boniniLivingAlgorithmsPower2024 p.19] that make platform algorithms work to users' own advantage. "Algorithmic agency is the reflexive ability of humans to exercise power over the “outcome” of an algorithm. However, this agency is symbiotically embedded in the environment in which it is exercised; people exercise their agency while acting upon certain algorithmic outputs and, at the same time, by reacting to them. This symbiotic relationship happens within the boundaries of the affordances of algorithmic infrastructures" [@boniniLivingAlgorithmsPower2024 p.20]. Situating this book on algorithmic resistance within this review of digital activism, two observations of gaps or issues in the literature result: digital activism and resistance are frequently spoken of in terms of 'platforms' and 'algorithms', but neither of these seems to have clear boundaries. In a recent review of four books focused on 'platforms', O'Loughlin [-@oloughlinBookReviewAlgorithms2025] says "Across all books I remain unsure, ontologically, what a platform is. Terrain, tool, battlefield, economic profit-machine, benign marketplace, cultural arbiter—all of these, and more?" (ibid.). The issue with nebulous terms like 'platforms' is that the literature can appear unspecific, or indirect. Even more nebulous is the term 'algorithm', which much digital activism literature focuses on but which, even when unpacked, remains a 'catch-all' term that refuses to engage with the technicality of what, how and why 'algorithms' do what they do. Although algorithmic resistances are much more involved than hashtag activism, this is still not the radical 'resistance-based motivation' that I hoped to find in the literature. Coming to an understanding of how to 'game algorithms' is indeed a form of oppositional knowledge, but it does not yet inform the alternative media elements of the RQs. For this I turn to activist and artistic projects of alternative media that do not register or exist within platforms.
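As a hedged illustration of the obfuscation tactic described above (hiding or distorting personal data to disrupt surveillance capitalism), the following sketch interleaves genuine search queries with unrelated decoys so that the resulting query log no longer cleanly profiles one person. It is a simplified gesture in the spirit of decoy-query tools such as TrackMeNot, not a reproduction of any particular tool; the query lists, ratio and function names are invented.

```python
import random

REAL_QUERIES = ["permacomputing reading list", "mastodon instance setup"]
# Hypothetical decoy topics, deliberately unrelated to the user's real interests.
DECOY_QUERIES = ["weather tomorrow", "pasta recipe", "premier league scores",
                 "cheap flights", "how tall is the eiffel tower"]

def obfuscated_stream(real_queries, decoys, decoys_per_real=3, seed=42):
    """Interleave each real query with several decoys, in shuffled order,
    so the resulting log no longer cleanly reflects one profile."""
    rng = random.Random(seed)
    stream = []
    for query in real_queries:
        batch = [query] + rng.sample(decoys, k=decoys_per_real)
        rng.shuffle(batch)
        stream.extend(batch)
    return stream

for query in obfuscated_stream(REAL_QUERIES, DECOY_QUERIES):
    print(query)
```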
The next examples do not stem from traditional academic literature, and are included here to highlight a gap in that literature on digital activism, away from the platform/algorithm paradigms above. They continue to inform the RQs and make proposals for how to think about and achieve oppositional action. This part begins by engaging with ideas involving diagrams and paranodes, then moves towards obfuscation, non-existence, opacity and "contra-internet". After this, this section of the review finishes with considerations of more practical and applied literature, on refusal and alternatives.
The idea of the paranode is diagrammatic. It refers to the spaces (and practices) that exist outside of, or resist, the dominant networks and systems of control, like those of algorithmic platforms and technologies. The term suggests that not everything is or should be connected or visible within these systems. Paranodes represent the margins or gaps where alternative ways of being and knowing can flourish, challenging the idea that total connectivity is inherently good or neutral, and pointing to the value of disconnection or invisibility as forms of resistance. The paranode idea comes from Mejias [-@mejiasNetworkDisruptingDigital2013 Chapter 9] but is bound, together with the next references, in an artistic project by Zach Blas [-@blasContraInternetJournal742016] called 'contra-internet'. It produced multiple artworks and culminated in an exhibition of the same title. Contra-internet is an imagining of alternatives to the "internet's growing hegemony", towards "future forms of resistance" [@broomeZachBlasContraInternet]. The artworks of 'contra-internet' are responses to political events of digital repression, such as the state-sponsored shutdown of the internet in Egypt in 2011, or the pro-democracy protests in Hong Kong in 2014. Blas' artworks take multiple forms, including writings, sculptures and film, and engage with postcapitalist politics, 'contrasexual' diagrammatic interpretations (queering digital matter as outlined by [@heemsbergenIntroductionAlgorithmicAntagonisms2022]) and paranodes. Among the intertwined productions and inspirations of 'contra-internet' are two further relevant literary contributions that help situate oppositional knowledge, resistance-motivated disconnections and digital activism; these are Édouard Glissant's concept of the "right to opacity", as articulated in his work Poetics of Relation [@glissantPoeticsRelation2009], and Thacker and Galloway's [-@gallowayExploitTheoryNetworks2007 p.135] Tactics of Nonexistence. Relating to the decolonial subjects of the first part of this review, Glissant's poetry is considered postcolonial: it is highly critical of colonial legacies, but also offers a constructive vision for decolonised and pluralistic futures. Of specific interest are the elements of Glissant's work that focus on opacity and the right of marginalised groups to modulate their opacity / visibility to oppressors. These are both decolonial resistance and tactical stances. In the context of media activism, Glissant speaks for the use of the Creole language and artistic expression, with degrees of opacity, as a way of resisting cultural erasure and as a form of tactical and practical political action. Thacker and Galloway's "tactics of nonexistence" are explorations of strategies that individuals or groups might employ to resist or evade the pervasive control mechanisms within networks, such as 'platforms' or 'algorithms'. While more of a meditation on the meanings of nonexistence, this part of their book (called "The Exploit") involves methods of becoming undetectable or untraceable within the network, to subvert the protocols of control. These tactics can include data obfuscation, anonymity and the creation of noise to disrupt surveillance and data collection efforts.
An article by de Valk [-@devalkRefusingBurdenComputation2021a] puts refusal as the foremost action of resistance, in the context of ever-expanding computational systems and their ecological impact. The article touches on multiple subjects of this dissertation and is highly relevant in the permacomputing context as well as the digital activism context. It highlights the practice of 'edge computing', a way of reducing network traffic and latency whereby vendors of high-bandwidth software services (such as video streaming or video conferencing software) offload certain computational tasks to the client (the user's computer). As technical or benign as this may seem, de Valk argues that this is a 'technological fix' that allows for 'business as usual' while shifting the environmental responsibility of software onto individuals. de Valk also highlights concerns about vendor lock-in, which can force unsustainable computational practices by obliging users to replace hardware that is still functional. Ultimately, the argument is for refusal to comply with systems that enable 'business as usual', by simply refusing to use them. The inclusion of this article in the review is a slight departure from the more ethnographic observations of digital activism cited above, but it is a necessary addition, as it puts digital activism methods of refusal in direct dialogue with the technical and practical issues that are lost when digital activism analysis stops at the 'platform' and 'algorithm' terms. Considering the RQs and the intended aims of this study, these refusals are the specific types of resistance-motivated disconnections I seek, based on oppositional knowledge and permacomputing principles.
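de Valk's point about edge computing can be restated in a small, hypothetical sketch: the total computational work does not shrink when a vendor 'offloads to the edge', only the ledger of who pays for it changes. The scheduler, task names and cost figures below are invented for illustration and do not model any vendor's actual infrastructure.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    cpu_seconds: float  # rough proxy for the energy the work consumes

def run_on_server(task: Task, ledger: dict) -> None:
    # Cost stays on the vendor's books: their datacentre, their electricity bill.
    ledger["vendor_cpu_seconds"] += task.cpu_seconds

def run_on_client(task: Task, ledger: dict) -> None:
    # 'Edge' offloading: the same work now draws power from the user's device,
    # and may push still-functional older hardware past what it can handle.
    ledger["client_cpu_seconds"] += task.cpu_seconds

def schedule(tasks, offload_to_edge: bool) -> dict:
    """Toy scheduler: the total work is unchanged, only who pays for it moves."""
    ledger = {"vendor_cpu_seconds": 0.0, "client_cpu_seconds": 0.0}
    for task in tasks:
        (run_on_client if offload_to_edge else run_on_server)(task, ledger)
    return ledger

tasks = [Task("video_decode", 4.0), Task("noise_suppression", 1.5), Task("background_blur", 2.5)]
print(schedule(tasks, offload_to_edge=False))  # vendor bears the cost
print(schedule(tasks, offload_to_edge=True))   # cost shifted onto users' devices
```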
QUESTION: Mention the editorial of APRJA Volume 10 "How To Refuse Research from The Ruins of Its Own Production" and the 2021 transmediale "for refusal"? transmediale: initially framed through friction, scale, and entanglement, the festival later addressed belief and compromise, recognizing the limitations and contradictions within acts of refusal. Refusal is neither passive nor simple: it demands alternatives and risks, shaped by friction and compromise. The festival framed refusal as a dynamic and necessary tool for political and social transformation.
Finally, as mentioned in the first part of the review, Couldry and Mejias' [-@mejiasDataGrabNew2024] book indicates 'how to fight back' in Chapter 6, which offers a 'playbook' of strategies for resistance. These are: Play #1, Working Within the System, which involves using laws, regulations and institutions to push for change. Play #2 is Working Against the System, which is about direct opposition to data colonialism: boycotting extractive platforms, supporting unionization efforts, whistleblowing, and supporting activist movements like NoTechForApartheid. Then, Play #3, Working Beyond the System, reimagines technology outside of Big Tech's control. This might look like supporting local and indigenous communities to develop their own digital infrastructure, and building models of convivial data that put community needs over profit.
The conversation in the Critical edTech space has been progressing from initial stances, where the impacts of technology on education are critically analysed, towards secondary stances, where edTech is progressively problematised [@selwynDigitalDegrowthRadically2023] and contemporary critiques propose forms of refusal. In this section I start by employing an existing outline of the literature (provided by Clark [-@clarkConstructionLegitimacyCritical2023]) to explain some initial critical stances. In the following section I look at the secondary stances and detail some much more radical critiques of edTech. Key literature in this secondary, more radically critical space is offered by Neil Selwyn [-@selwynDigitalDegrowthRadically2023], particularly with later work focused on digital degrowth and sustainable education technology, along with Audrey Watters [-@wattersTeachingMachines2021] and Ben Williamson [-@williamsonBigDataEducation2017], edTech critics who widen the scope to consider infrastructural, economic, ecological and political issues related to edTech.
Reviews of Critical edTech research ("critical studies into the nature, use and effects of educational technology"), such as the one provided by Clark [-@clarkConstructionLegitimacyCritical2023], map out multiple problematic areas resulting from the pervasion of technology in education. Clark's short review firstly cites the impact on pedagogy, where Koehler et al. [-@koehlerTechnologicalPedagogicalContent2014] detail how technology reshapes traditional teaching methods, seeking more interactive or individualised (personalised) learning because digital technology enables this, whether it matches up with educational goals or not. Their analysis and contribution comes in the form of a framework for assessing the knowledge teachers need to achieve "effective technology integration". Decuypere et al. [-@decuypereIntroductionCriticalStudies2021] provide another critical view of the impact of technology on education by showing how digital education platforms are reshaping education. They employ an analytical "critical platform gaze" that looks at edTech platforms as connective artifacts of socio-technical assemblages. Their findings highlight the commodification of education when profit-driven services try to solve educational goals, as well as the risks of the standardisation of education. They argue that digital education platforms can promote a neoliberal vision of education where individualisation and market-oriented goals are prioritised. Finally, [@gallagherOntologicalTransparencyInvisibility2021] highlight the idea of 'Hidden Curricula' and a lack of ontological transparency, whereby important aspects of the learning process are rendered invisible. By using critical pedagogy methodologies they reveal how issues that contradict educational ideals, such as power structures, biases and corporate influence, are often obscured by edTech platforms.
Second cited are concerns surrounding datafication, the collection and use of data as a way of quantifying educational spaces at "all levels of educational systems (individual, classroom, school, region, state, international), potentially about all processes of teaching, learning and school management" [@jarkeEditorialDataficationEducation2019]. Employing Berlant's notion of cruel optimism, MacGilchrist [-@macgilchristCruelOptimismEdtech2019] uses ethnographic interviews to show how the optimistic goals of edTech (promoting equity, improving educational outcomes) are paradoxically hindered by oversimplifying complex issues to data, commodifying that data, and therefore shifting responsibility away from, or distracting from, systemic change with a solutionist rhetoric. In a specific example of data use in education, Williamson [-@williamsonBringingBiodatafiedChild2020] examines the intersection of computational biology and education to highlight the social and ethical implications of data-intensive bioinformatics in education. This is done via critical analysis of existing literature and case studies. Then Witzenberger et al. [-@witzenbergerWhyEdTechAlways2021] show how "data used at the heart of pre-emptive EdTech seeks to exclude students and configures them as absent".
Third is the issue of the reinforcement of inequalities, which is the primary concern of the book by Rafalow [-@rafalowDigitalDivisionsHow2020], whose study was conducted in three different types of school: a private, predominantly white school; a middle-class, predominantly Asian American school; and a third school attended predominantly by Latinx students. Among the findings, it appears that the learner population often knows more about technology than the adults do, yet "the stereotypes teachers adopt for their students of color depend both on racial stereotypes in the larger society and on the staff culture of the school" [@warikooDigitalDivisionsHow2023]. These stereotypes end up affecting Rafalow's area of focus, which is teachers' response to "students' creative, non-academic usage of technology for fun" (ibid.). Another example of the reinforcement of inequalities enabled by edTech tools is shown by Swauger [-@swaugerOurBodiesEncoded2020] in a piece that analyses proctoring software. This type of tool exists as an outsourcing "of the human-proctoring [model to] algorithmic proctoring, sometimes called 'automated proctoring'" (ibid.). Software such as Proctorio, Respondus, ProctorU, HonorLock and Examity records video and audio of a learner taking a test, to assess whether any suspicious behaviour has taken place. Unsurprisingly, Swauger finds that these tools flag more 'suspicious' (often incorrectly) behaviours when observing learners from marginalised communities. Because of the biased and nonrepresentational training of their machine learning algorithms, these tools "reinforce oppressive social relationships and inflict a form of data violence upon students" (ibid.). In their article on "Dashboard stories", Jarke and Macgilchrist [-@jarkeDashboardStoriesHow2021] focus on "data dashboards of learning support systems which are based on Machine Learning". Using case studies of a "leading predictive analysis system", the authors examine how causality is implied between the learning system usage facts that are being visualised. This dashboard overview ends up simplifying the realities of learning and social structures to data points, which can be misunderstood or misread by the tool and the human teaching supervisor. Some of these misunderstandings disproportionately affect certain student populations, and the authors call for a critical assessment of predictive analytics in education.
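Swauger's bias claim can be made concrete with a small, entirely synthetic calculation: if a detector's false-positive rate differs between demographic groups (for example because its training data under-represents one of them), then honest students in the disadvantaged group are flagged far more often, even when nobody is cheating. The group labels and error rates below are invented and are not drawn from any real proctoring product.

```python
import random

def simulate_flags(n_students: int, false_positive_rate: float, seed: int) -> int:
    """Count how many honest students a detector wrongly flags as 'suspicious'."""
    rng = random.Random(seed)
    return sum(rng.random() < false_positive_rate for _ in range(n_students))

# Synthetic false-positive rates: the detector was trained mostly on group A,
# so it misreads group B's faces and behaviour more often. All numbers invented.
groups = {"group_A": 0.02, "group_B": 0.12}

for group, fpr in groups.items():
    flagged = simulate_flags(n_students=1000, false_positive_rate=fpr, seed=7)
    print(f"{group}: {flagged} of 1000 honest students flagged")
```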
The fourth area of critical edTech studies is assetization and rentiership, which is detailed in Birch's development of a theory of Rentiership for Technoscientific Capitalism [-@birchTechnoscienceRentTheory2020]. Assetization is explained as the process that transforms various entities, such as infrastructure, data, online interactions and most anything that can be digitally captured, into assets or capitalised property. Rentiership subsequently appropriates these assets through, for example, intellectual property mechanisms, and rents them out to any interested parties. These two processes of capture and capitalisation are clearly echoed in the first part of this review, and are possibly best understood with Mejias' and Couldry's [-@mejiasDataGrabNew2024] title 'Data Grab'. The clearest example of this assetization and rentiership process is the practice of third-party data access, for systems like digital advertising. Birch's paper ultimately contributes to a better understanding of how capitalist 'technoscientific' economies are restructuring around the control of assets and infrastructure for the capture of future assets. Komljenovic [-@komljenovicRiseEducationRentiers2021] observes the effects of this in the education sector. The paper highlights five implications of digital rentiership in education, among which are two forms of rent that are now commonplace in education. The first is monetary rents, where education institutions must pay increasing monetary rents for access to digital platforms and services. Instead of purchasing books for libraries, schools must now purchase licenses to access publisher databases. Instead of spending resources on laboratories, lecture halls, and campus learning infrastructures, institutions now pay for access to Virtual Learning Environments. The second is what is called 'data rents' [@sadowskiInternetLandlordsDigital2020] or 'indirect rents' [@langleyPlatformCapitalismIntermediation2017]. From a social and digital justice point of view, this is one of the most problematic. "Data rents refers to the digital traces that students and staff leave behind when interacting with digital platforms": the data capture that happens after a learner or staff member uses a digital learning platform. The data includes content (posts, discussions, interactions, uploads) as well as metadata (user location, time spent on platforms, and any other identifiable metadata that might be valuable to third parties). The result of the assetization and rentiership of digital systems in increasingly digitised learning environments is that learners pay (at least) twice for their access to education: once through their tuition fees, and a second time when 'learning' platforms extract their data rent from their learning activities.
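The 'data rent' described here can be pictured as the analytics event a learning platform might emit each time a learner acts. The sketch below is hypothetical: the field names and values are invented to mirror the categories named in the text (the learner's content plus identifying metadata), not any specific vendor's schema.

```python
import json
from datetime import datetime, timezone

def build_learning_event(user_id: str, action: str, payload: str) -> dict:
    """Assemble a hypothetical analytics event: the 'content' a learner produces
    plus the metadata (who, when, from where, on what device) that carries
    most of the third-party value."""
    return {
        "user_id": user_id,                   # identifiable
        "action": action,                     # e.g. 'forum_post', 'quiz_attempt'
        "content": payload,                   # the learner's actual contribution
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "approx_location": "GEOIP-PLACEHOLDER",  # in practice derived from the IP address
        "device": "laptop/firefox",              # placeholder user-agent summary
        "session_seconds": 1260,                 # time-on-platform metric
    }

event = build_learning_event("student-4821", "forum_post", "Reply to week 3 discussion")
print(json.dumps(event, indent=2))  # what the 'data rent' looks like in practice
```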
Fifth is platformisation, which was discussed as a nebulous term in the digital activism section of this review. In the context of critical edTech studies, in Chapter 6 of The Platform Society, José van Dijck et al. [-@vandijckPlatformSociety2018] show how digital platforms are reshaping the education sector. They examine the growing influence of private tech companies in educational institutions, highlighting the challenges of platformisation in education. Among the issues of platform presence in education are the tensions between public and private interests. This concern is also raised by Singer [-@singerHowGoogleTook2017] and Constant [-@constantDearStudentTeacher2020], who focus on the unquestioned presence of Big Tech in education.
Lastly is algorithmication, which is detailed in the provocative book titled "Should Robots Replace Teachers?" [@selwynShouldRobotsReplace2019]. Central to the book is the algorithmication of education, referring to the increasing reliance on AI-driven systems for teaching, such as "lifelong 'learning companions', 'behind-the-scenes' technologies such as advice chat-bots or automated essay-marking, and the use of 'learning analytics' to mobilize data about student performance" [@sperlingerSHOULDROBOTSREPLACE2020]. In later writings, Selwyn [-@selwynEducationTechnologyKey2022] dedicates a full chapter of a book to Artificial Intelligence and the automation of education and asks "readers to evaluate their own values in connection to technology" [@terasEducationTechnologyKey2022]. For a more applied critique, Williamson [-@williamsonPsychodataDisassemblingPsychological2021] looks at the issues of algorithmication in 'social-emotional learning' (SEL), which is being reshaped by data-driven technologies, AI, and algorithmic decision-making.
All the issues with edTech listed in Clark's review testify to the development of a 'technological pessimism', a key position that will be further detailed in the theoretical framework chapter. The six areas of critical edTech detailed so far aim to give an understanding of the larger state of the literature, but it is of note that, at the time of writing, a new academic conference for Critical edTech studies [@eccesEuropeanConferenceCritical2024] has been established. This inaugural event has the aim of 'defining' the field of Critical edTech studies, which gives a sense of the emerging or novel nature of this literary area.
INCOMPLETE: A number of additional dimensions of critical edTech are worthy of consideration to establish a permacomputing contextual framework, including:
- More direct understandings of the critical components of critical edTech are outlined by Selwyn et al. [-@selwynWhatsNextEdTech2020], where six 'challenges' are put forward for critical educational technology scholarship at the start of the decade.
- Clark's paper itself, on the construction of legitimacy, demonstrating the pervasive attitude of tech companies ensuring they keep a foothold in education for further assetization and rentiership
- The addition of 'Critical edTech' as a component to the contextual framework of permacomputing relies on an understanding that we live in a postdigital society. Webster [@websterUpdatingDigitalCitizenship2024], "updating digital citizenship".
- Finish with Macgilchrist [-@macgilchristWhatCriticalCritical2021] who provides three responses to the question: what is critical in 'critical' studies of edtech. These provide ...
QUESTION: conclusion needed?
[^biasproblems]: Examples of biased technologies: automatic soap dispensers that dispense to white hands but not Black hands; facial-recognition software that detects white faces but not other skin tones; or the text-analysis software 'Grammarly' qualifying the writing of a Black man as 'angry' because he was writing about "Black food history in connection with Black resistance and Black joy" [@aliDecolonisingComputing2020], the app automatically registering writing connected with these cultures as 'angry'.