Australia Must Lead on AI Rules or Face Strategic Dependence

Artificial intelligence is being sold to us as the next miracle of statecraft, but here’s the hard truth: unless Australia shapes the rules of the game, we will end up as a passive technology taker, chained to systems we neither understand nor control. AI is not born neutral. The rules that govern how it behaves are set by its authors: corporations and governments embedding their own assumptions, priorities, and cultural defaults into code. Those rules travel with the system. If Australia adopts AI built to foreign specifications, we also import those embedded choices—about privacy, bias, autonomy, and control. We can avoid this dependency only by setting our own AI development guidelines, explicitly crafted to preserve sovereignty. That means writing the governance rules here, not just buying in systems and pretending we can retrofit “Australian values” later. The strategic choice is not whether to use AI, but whether to use someone else’s rules or our own.

Australia is not the United States, nor is it China. We do not set the pace in hardware, nor can we outspend the giants in building fleets of drones, satellites, or algorithmic war machines. Our choice is narrower but sharper: either we invest intellectual capital in defining AI’s guardrails (ethics, governance, interoperability standards) or we accept whatever Washington or Beijing hands down. The polite word for that is “alignment”. The real word is dependence.

Defence planners love capability matrices. Build submarines, buy fighters, patch together the numbers and call it deterrence. But AI breaks that model. Capability is no longer measured just in tonnes of steel or lines of code. It is measured in who controls the decision loop: who defines what “trusted autonomy” actually means. Right now, Australia treats AI as another procurement line item. A widget. A force multiplier. But if the standards are written offshore, every Australian AI-enabled system comes with an invisible leash. That leash is strategic dependence by another name.

We have been here before. During the Vietnam War, we outsourced strategy to Washington and paid the price in blood. During Iraq, we followed blindly into a misadventure dressed up as inevitability. Even in the Cold War, we convinced ourselves that hosting someone else’s weapons made us a player rather than a pawn. History is clear: when Australia lets others set the rules, we end up carrying weight without carrying influence. AI is no different—except the consequences move faster, and the dependency cuts deeper.

Here’s the good news: Australia does not need to build an AI arsenal to matter. We need to set standards. Interoperability rules for defence AI. Transparency frameworks for government adoption. Ethical guardrails that stop AI being used to manipulate citizens or erode democracy. If we can codify those, and get others to sign on, we turn Australia into a standards setter. That’s power without brute force. It is the same logic that makes Geneva the keeper of humanitarian law, or Brussels the global rule-maker on privacy. Influence through governance.

But let’s be honest about Canberra. Left to its own devices, the bureaucracy will do what it always does: commission a review, generate paperwork, and then quietly buy whatever the Americans are already using. It is the path of least resistance masquerading as pragmatism. That is not strategy. That is administrative cowardice.

The window is short. Standards are being written right now—in Washington, Brussels, Beijing. If Australia does not walk into the room with its own position, we are irrelevant before we even begin. The choice is simple: be a participant in shaping AI norms, or be a consumer of someone else’s strategic imagination. Once locked in, these standards will last decades. By the time the ink dries, our grandchildren will be saluting systems we never chose.

So let’s call it without euphemism. Australia’s AI choice is not about whether we “embrace innovation” or “seize opportunities”. That is the language of bureaucrats trying to look busy. The real question is brutal: do we shape the rules, or do we live under them? History suggests we will sleepwalk into dependency. But history also shows that when Australia chooses to think for itself, as it did at the inception of ANZUS and other international agreements, we can punch above our weight. AI is not a side project. It is a strategic hinge. And middle powers do not get many chances to write themselves into the rules of the future. This is one.
