The Slow Shoe Shuffle
Hear ye, hear ye,
They call it the bureaucrat's battle-cry: What do we want? Incremental change! When do we want it? In due course!
When it comes to government action on technology, be it AI or privacy, it seems our Federal Government is taking this ethos seriously.
This week Industry Minister Ed Husic released the latest baby steps in the urgent battle to place regulatory guardrails around the introduction of AI.
On its face, there is merit in the framework laid down, with the idea of mandatory guardrails for high-risk AI (definitions still to be determined) providing positive and enforceable obligations on those who deploy systems on their workers, customers and the public.
We are not there yet, but a world where AI is treated like workplace safety, with overriding obligations on businesses driving safe and responsible practices, seems better than a tick-a-box compliance framework that will always struggle to keep up with the tech.
But now it goes into (yet) another round of consultation that won't see legislation until after the 2025 election, which, while Labor's to lose, is no certainty.
In the interim, industries will be inundated with product on the promise of saving costs, incumbents will become more entrenched, and other jurisdictions like Europe, and even more laissez-faire ones, will see their own regimes mature.
The government, which still seems out of step with a public that sees more risk than reward in these new technologies, appears enthralled by industry's shilling of this technology as a magic wand rather than a way of processing our collective wisdom.
The other point that seems to be missing is that these guardrails are not constraints; they are guides that lead to better adoption of this technology.
As we have shown in our work with UTS's Human Technology Institute, the real value of machine learning comes when it is deployed as a tool designed by workers, rather than as a replacement for them. Nudging businesses to look at it in this way is the key to unlocking productivity.
Meanwhile, the Attorney-General appears to have missed his own August deadline to deliver the first meaningful privacy laws in 40 years, with rumours that the structural reform to end tick-a-box consent and give people enforceable rights won't happen in this term of government.
This means that much-touted measures like child safety, cybersecurity, Digital ID, regulation of RentTech and the response to Robodebt, which the same technocrats have said will be anchored by privacy reform, will lack the basic foundations of data responsibility.
Maybe there is a long game that I'm missing here, but on these two crucial pieces of reform it seems the forces of self-interest and bureaucratic inertia have well and truly captured the agenda, so that even incremental change seems an aspiration. In due course.
This week’s Burning Platforms: The NSFW Edition
What can OnlyFans teach us about the internet? Social media academic Emily van der Nagel takes us through her research with content creators and consumers to reveal the logic of a very different platform.
She joins our regular panel of Digital Rights Watch chair Lizzie O’Shea, Health Engine CEO Dan Stinton and Centre of the Public Square convenor Peter Lewis.
They also discuss:
· the arrest of Telegram CEO Pavel Durov
· musicians counting the cost of AI
· and what next for the News Media Bargaining Code?
You can listen to the episode here.
Policy Update:
Per Capita has made a submission to the final ACCC inquiry into the implementation of the recommendations of the Digital Platforms Inquiry.
Our Director of Responsible Technology, Jordon Guiao, argues:
- Australia should, at a minimum, enact regulation at parity with significant global regulation like the European Union's (EU) Digital Markets Act (DMA) and Digital Services Act (DSA)
- Australia should tax large digital platforms appropriately, ensuring they do not employ tax minimisation strategies while benefitting enormously from the Australian public and the Australian economy
- We should ensure the consumer harms and power imbalances of current digital platforms are not carried over to newer technologies like artificial intelligence (AI), and apply systemic reforms to achieve this
- The government's progress (or lack thereof) in enacting the recommendations of the Digital Platforms Inquiry should be clarified, with a clear checklist produced to help the public and interested parties hold government to account
You can read the full submission here.
What we are clicking:
Analog Privilege: Tech Policy Press has published this thought-provoking piece from Maroussia Lévesque on how the option to opt out of the algorithm is becoming the preserve of wealthy elites.
Join or Die: New Public does great work imagining non-corporate digital spaces. Last week they promoted this interview with Pete Davis, a champion of grassroots community, on the importance of joining real groups in your community as a critical piece in the effort to protect American democracy.
Storytime: I'd argue some of the best non-fiction writing at the moment is around technology: think Brian Merchant, Maria Farrell and my latest find Kyle Chayka, whose book Filterworld is worth a read. Anyway, now there's a call for fiction writers to join the vanguard, with an invitation for writers to challenge law enforcement propaganda. We will be clicking what comes out the other end.