Reframing Failure in Digital Scholarship

Chapter 12. When optimization fails us

Jentery Sayers

I’ve proposed that project. Okay, I’ve proposed quite a few of them: the ones that promise to deliver all the digital things. An app. An interactive map. An online exhibit. Data visualizations. A blog and social media presence. Open-access data repositories and peer-reviewed articles. Such promises reflect institutional demands on the humanities to remain relevant online while doing a lot with very little. Practitioners spread themselves and their projects thin whenever funding is scarce, and the norm in today’s content industry is to diversify the channels of communication to maximize circulation and reach everyone (Eichhorn 2022, 4-8). An app engages one audience, a journal article addresses another, a short video offers the tl;dr, and so it goes. Digital projects never rely on a single publication format, and their relevance is assessed by academic citation counts as well as internet performance metrics such as site visits, page views, clicks, downloads, subscriptions, and search engine rankings. Publishing across these two attention economies is a challenge, partly because citation and performance metrics cannot measure the quality of engagement. Both economies are also plagued by assumptions that digital publishing is – or should be – fast and easy (Fitzpatrick 2011, 48, 191). An emphasis on the immediacy of digital content has, for instance, neglected matters of project maintenance and long-term care. Several projects I published in the early 2010s are inaccessible today. The map doesn’t load. The exhibit returns a 404 error. The data disappeared. The software was deprecated. I forgot to renew the domain.

The intent of this reflection is not to lament the rise of the content industry and its fetish for metrics. It is to briefly outline why management paradigms from that industry may not transfer well to the academy or, put more strongly, why academics may want to oppose or resist those paradigms. I am not implying that colleges and universities exist outside capitalism in a life-of-the-mind utopia. I am asserting that cooperative projects are among the most meaningful contributions practitioners can make to their scholarly fields. Although skeptics may claim that cooperative projects are bespoke and thus fail to scale, a collective awareness of content management paradigms helps practitioners avoid neoliberal principles of optimization that ultimately fail us. We can approach failure by focusing on substance: when an article or exhibit is not as compelling or thorough as we hoped it would be. We can also understand it in terms of promise and timing: when we miss due dates or opportunities to flip prototypes into products. Yet a more generative framework underscores design: an attention to process and its structures, to who defines failure, and to how failure is defined in the first place.

Some projects don’t have much of a plan; or, more precisely, the plan circulates mostly in the head of the principal investigator (PI). This paradigm echoes the auteur theory of film, and it’s feasible when someone works alone. For example, I maintain my own online portfolio, which I can update or fix when the mood strikes me. Opacity emerges when a single person – the ‘ideas person’ – oversees a project and hires research assistants, staff, and other team members. Trajectories can change rapidly, collaborators may be unsure what to do and when, and tensions likely arise between concept and practice. A team is expected to read the mind of the PI, who is also the primary stakeholder. Project management software doesn’t solve the problem, either. It’s conducive to busywork and to PI detachment from the team. Worse, it becomes ‘bossware’ that lowers team morale and gives members the (perhaps accurate) impression that their activities are being monitored to optimize productivity (Corbyn 2022; Munn 2024). It’s not a stretch, then, to associate this PI model of leadership with a high risk for toxicity.

Waterfall management paradigms are meant to correct the leadership problems linked to the lone ideas person. A waterfall plan is prescriptive. It’s shared among the team, as are milestones for achieving it (Mokhtar and Khayyat 2022, 53). In the parlance of Game Studies, the project runs on ‘rails,’ like a train from origin to destination. Instructions are clear, and the vision and its trajectories should be, too; however, almost everything is fixed to optimize time. Team members cannot be enticed by detours and must quickly resolve hiccups to maintain course. They must also refrain from iteratively releasing components of the project, meaning most public feedback is suspended until the entire project is complete (Thesing et al. 2021). Even though waterfall management motivates teams to avoid scope creep and stick to their budgets, it often pushes researcher curiosity and experimentation to the margins. Such tunnel vision fails a team if it results in monotony, high turnover rates, burnout, or alienation from the project itself.

While the waterfall model privileges the project, the agile model foregrounds the process. It is flexible, thrives on iterative development, and adapts to feedback throughout a project’s lifecycle. The nonlinearity of agile methods appeals to practitioners who need to test multiple prototypes before settling on one, and prototyping can unfold across several publication formats. Indeed, the agile model may take its time, venturing off rails to pursue sidequests that weren’t mentioned in the initial plan. For these reasons and many more, it is adopted partly or fully by many digital humanities teams (Tabak 2017). The problem, though, is that the agile model appeals to users of products and thus to customers (Beck et al. 2001). This orientation means the quality and content of agile projects are easily determined by internet traffic and online performance metrics, with management optimizing attention and service provision rather than time or productivity. Agile gives as many users as possible what they want, tracks their engagement, and quantifies it for stakeholders or investors.

A cooperative model learns from these three approaches to identify when and why neoliberal management paradigms that prevail in industry fail us as academic researchers and practitioners. The cooperative model highlights the culture of a project, the relations within it, how it brings people together, and whether it alienates them from each other and the work they’re doing. One example is what Tiffany Chan and I call ‘minimal computing from the labour perspective,’ which does not reinvest a team’s surplus labour in increased productivity (Chan and Sayers 2022). It invests instead in shared structures, collective expertise, and common activities that require patience and extensive consultation. Another example appears in research by Liz Lane and Kristen R. Moore. They demonstrate how aspects of the agile model can be repurposed to foster a ‘multivocal critical imagination’ that encourages ‘collaboration and coalition building across disciplines to promote equity and justice’ in the academy and attends to issues that matter deeply to people’s communities and daily lives (Lane and Moore 2023, 43-44). Context matters most in both of these examples, where there is no one-size-fits-all approach to project design and management.

Against the maximization of productivity and attention, the cooperative model reveals that neoliberal paradigms of optimization strategically absorb the time, space, and mechanisms we require to interrogate the values of project management in today’s metrics-driven content industry (Nissenbaum 2005, lxvi-lxx). Slowing down the research process helps to counter that neoliberal tendency, and so, too, does organizing to build projects, cultures, and teams that routinely remind people they can’t do all the things – and they don’t have to.

Bibliography

Beck, Kent, et al. 2001. ‘Manifesto for Agile Software Development’. https://agilemanifesto.org/.

Chan, Tiffany, and Jentery Sayers. 2022. ‘Minimal Computing from the Labor Perspective’. Digital Humanities Quarterly 16 (2). http://www.digitalhumanities.org/dhq/vol/16/2/000600/000600.html.

Corbyn, Zoë. 2022. ‘“Bossware Is Coming for Almost Every Worker”: The Software You Might Not Realize Is Watching You’. The Guardian, 27 April 2022. https://www.theguardian.com/technology/2022/apr/27/remote-work-software-home-surveillance-computer-monitoring-pandemic.

Eichhorn, Kate. 2022. Content. The MIT Press.

Fitzpatrick, Kathleen. 2011. Planned Obsolescence: Publishing, Technology, and the Future of the Academy. New York University Press.

Lane, Liz, and Kristen R. Moore. 2023. ‘The Invisible Work of Iterative Design in Addressing Design Injustices’. Technical Communication & Social Justice 1 (2): 28-48. https://techcommsocialjustice.org/index.php/tcsj/article/download/11/16.

Mokhtar, Renad, and Mashael Khayyat. 2022. ‘A Comparative Case Study of Waterfall and Agile Management’. SAR Journal 5 (1): 52-62. https://doi.org/10.18421/SAR51-07.

Munn, Luke. 2024. ‘More than Monitoring: Grappling with Bossware’. International Journal of Communication 18: 3128-3139. https://ijoc.org/index.php/ijoc/article/viewFile/21399/4663.

Nissenbaum, Helen. 2005. ‘Values in Technical Design’. In Encyclopedia of Science, Technology, and Ethics, edited by Carl Mitcham. Macmillan.

Tabak, Edin. 2017. ‘A Hybrid Model for Managing DH Projects’. Digital Humanities Quarterly 11 (1). http://www.digitalhumanities.org/dhq/vol/11/1/000284/000284.html.

Thesing, Theo, Carsten Feldmann, and Martin Burchardt. 2021. ‘Agile Versus Waterfall Project Management: Decision Model for Selecting the Appropriate Approach to a Project’. Procedia Computer Science 181: 746-756. https://doi.org/10.1016/j.procs.2021.01.227.

Pre-review version (January 2025)
Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International