Public trust in government does not necessarily extend to the digital services it provides, particularly when it comes to artificial intelligence.
But AI strategist Justin Tauber says trust is possible and points to an example from the 1880s.
“It’s 1889. Electricity is a hot new business transforming society, transforming the economy,” he reminded attendees at an Amazon Web Services symposium for the public sector in Canberra this week.
“New Yorkers are getting very stressed about the proliferation of poles and wires running up all around the city.”
Back then, there was panic about the dangers of running electrical wiring inside walls, where it could not be seen. Then, as with AI now, the technology would eventually reach into every corner of life.
But trust had to be earned and people had to be educated about safety, such as not wiring up a bathtub for hot water.
“We learned how important it was to share that information with our children and to expect that of each other,” Mr Tauber said.
Now, governments taking up generative artificial intelligence, and AI in general, need to learn how to make it safe and trustworthy, he said.
Generative AI uses machine-learning models, often neural networks inspired by the structure of the human brain, trained on enormous datasets to produce text and images.
It’s been used for years by music streaming apps and in technologies such as chatbots and image generators.
It hit the public consciousness when millions of people began using text generators such as ChatGPT to do homework and write reports, job applications or computer code.
As a senior executive at cloud software company Salesforce, Mr Tauber wants to provide the “nudges and scaffolding” to make sure AI is used intelligently and responsibly.
In getting ready for the next wave of technology, organisations need to develop ethical frameworks for data and analytics informed by legislation and best practice, according to Australian Taxation Office assistant commissioner of data insights Ben Taylor.
“It doesn’t matter whether you’re using Gen AI, whether you’re using big data, small data, Excel spreadsheets, or storing your stuff in filing cabinets,” he said.
But Mr Tauber warns that trust in digital government, where everything from tax records to welfare data is stored in vast data repositories, must be earned.
A recent survey found 90 per cent of people agree that quality of service influences their trust in government data management.
Almost all (93 per cent) agreed government services should equal the best that private-sector organisations and global leaders can offer.
“The better the service, the more you feel like the government cares,” one respondent said.
For example, if an online service is unreliable, slow or hard to navigate, people think the systems are poor and data is insecure.
Mr Tauber said trust in digital public services can be earned through good experiences and clear information about why data is being shared and how its use is limited.
ATO assistant commissioner of enterprise data and analytics Fawab Abro said the tax office is “very risk-averse” about the safety of the taxpayer information it holds because it is some of the most valuable data in the nation.
“We’re going to need you to prove that it works, right, before we consider implementing any of your technology,” he told symposium attendees.
“Whether we should be training our own (Chat)GPTs for example – those are the conversations we have to have. We can’t shy away from them.”
AI technology could be vital for the tax office in spotting criminal behaviour and preventing theft or the payment of refunds to people who aren't entitled to them.
But Mr Abro said a call centre operator would never be allowed to just generate a tax determination or anything that might have legal ramifications.
Rather, someone with legal expertise would still need to check and validate what was generated by the technology.
At the same time, Mr Abro said, “we’re not afraid to work with cutting-edge technology”.
This proved to be the case when the ATO, through Operation Protego, stopped billions of dollars of sham payments using sophisticated data intelligence and matching tools.
The crackdown on the biggest GST fraud in Australia’s history halted $2.5 billion in fraudulent refunds from being paid and resulted in action being taken against more than 53,000 entities.
The ATO also noted the difference between so-called enterprise AI that the tax office, banks and other critical services have used for decades, and generative AI, which is rapidly becoming available to everyone.
“We know what we’re doing with it, we know the appropriate use cases, we don’t apply it to the wrong things,” Mr Taylor said.
Mr Tauber said the upswell of generative AI will bring rising expectations for services more generally, as interactions become more personalised and seamless – because if ChatGPT can answer a question, why can’t a human?