I Knew Them When They Were Just Starting Out

Hear Ye! Hear Ye!
I never thought I’d write these words: my favourite economists have just won the Nobel Prize. Daron Acemoglu and Simon Johnson were two of the three economists awarded the gong for their work over many years on the role democratic institutions play in driving economic productivity.
To be honest, I only cut onto the wave just before it broke, totally inspired by last year’s Power and Progress, via a review in Business Insider that my friend and collaborator Nick Davis from the Human Technology Institute (HTI) sent me.
The book is sprawling and visionary – much more the political economy on which I was raised – bringing to life the construction of the Suez Canal (and why it delivered where the Panama Canal didn’t), the rise of the printing press and the steam engine, and medieval weaponry.
The two are great storytellers, but with a hard economic edge based on a critical insight: the productivity our economic system craves can only be created when people have a stake in technology.
They describe how history shows that when technology merely automates and surveils workers, it may lead to an accumulation of profit and wealth, but it will not render greater output from human capital, i.e. people.
It is only when the technology is designed by workers to be useful – as a tool, a new application of existing skills, or a new connection – that genuine value is created.
Since reading the book I’ve had the chance to test some of their theories on work that Nick and I partnered on, through the Invisible Bystanders Project during Nick's time at HTI. It showed how nurses, retail workers and public servants have the capacity to guide AI technology, if only anyone would make the effort to take them on the journey.
What’s so energising about their theory is that worker voice doesn’t become an accommodation or a claim; it lies at the heart of good economic theory. It is literally the key to unlocking the potential of AI, but also the bulwark against uses that will diminish us.
As they argue, the only people who can really do this are workers who are organised not just industrially but politically – which was the ultimate tipping point in turning the dystopia of the early industrial revolution into a driver of wealth and prosperity.
This bigger systemic project is sitting there in front of us. A vision of technology driven by workers. Four specific reforms would really embed this thinking:
- Industry Work Councils: To genuinely write the brief for technology, rather than have to deal with vendor-imposed products.
- Employer Liability: As with workplace safety, placing a general obligation on employers to introduce AI safely would force a level of care and engagement that would make the technology better.
- AI Safety Officers: Again like workplace safety, empowering workers to oversee the implementation of any new systems and processes – with the right to halt new systems where they are dangerous – would drive the sort of literacy that will enhance productivity.
- Baseline Standards including a minimum staff number: Where industrial agreements set staffing levels – nurse-to-patient ratios in nursing, for instance – these should be used as a baseline.
I’ve been banging on about these things for about six months now, and while people seem interested it has been hard to gain traction – so maybe the Nobel Prize is a moment of validation where we can really get cracking to unlock the power of Workers’ Intelligence.
PS. For those who want a shortcut, check out one of my favourite podcasters, The Gist’s Mike Pesca.
Burning Platforms – the Digital Dark Ages
As researchers are increasingly locked out of digital platforms, we contemplate a new era of control and secrecy, with responsible tech academic Gina Neff.
She joins our panel of Digital Rights Watch chair Lizzie O'Shea, Choice’s digital campaign lead Rafi Alam and me, Peter Lewis, from Per Capita’s Centre of the Public Square.
Also this week:
Policy updates:
There’s a fair bit of action going down as the first term of the Albanese Government approaches the finish line:
- Proposed misinformation laws seem destined to fail after a tough round of committee hearings where the fundamental flaws in the framing of these laws were exposed.
- Incremental privacy reform is moving, eh... incrementally.
- Non-binding but vaguely positive AI design standards have been recommended.
But there was positive news when the Joint Select Committee on Social Media and Australian Society handed down its second interim report last night, which is really a game plan to deal with Meta's decision to walk away from the News Media Bargaining Code.
Beyond the intricacies of that gumbo of conflicting powers is a really significant structural recommendation:
3.161 The committee recommends that the Australian Government establish a Digital Affairs Ministry with overarching responsibility for the coordination of regulation to address the challenges and risks presented by digital platforms. The Ministry could also play a role in coordinating monitoring and research activities to assess the ongoing impact of digital platforms on Australian society, as well as the effectiveness of existing and future regulation. Because matters relating to the regulation of social media are broad, the new Digital Affairs Ministry should be given an equally broad remit so that it can regulate matters such as, but not limited to, privacy and consumer protection, competition, online safety, and scams.
Creating a Ministry with responsibility for these inter-connected policies is critical to a coherent regulatory regime, as we pointed out in our submission to the inquiry. Right now, responsibility shifts between the Treasurer (when he is not conflicted out), the Attorney-General, and the Minister for Communications.
This is one of the reasons the reforms recommended by the ACCC's Digital Platforms Inquiry back in 2021 have moved so slowly: each sits in a silo with its own priorities and trade-offs.
Having a single point of government responsibility would open the way for the grand bargain we need if we are not just to sustain the news media and hold digital platforms to account, but also to set the best framework for the coming AI wave.
What we are clicking
The End Game – If you want to be really afraid about what Elon Musk stands to gain from the upcoming US election result, this long read from the Drop Site will make your jaw drop.
Distressed Assets – What happens when a DNA testing site goes bust? Your DNA becomes part of the fire sale, as Kristen Brown writes in The Atlantic.
Black Magic – Finally, our own Jordan Guiao in Innovation Aus, arguing the case that until we can unlock the black box of neural networks, we will never be able to deem AI ‘safe.’