by Wilson da Silva
American Reporter Correspondent
MELBOURNE, Australia – Need a divorce? Press enter. What began purely as a research project in Australia four years ago may well find its way to the desktop of divorce lawyers in the next few years: a computer program that mimics the thinking of a judge when deciding how to divide assets in a divorce.
Dubbed “Split Up,” it is the brainchild of Australian Ph.D. student Andrew Stranieri and Dr. John Zeleznikow of the Database Research Laboratory at La Trobe University in Melbourne.
Housed in a 386 laptop with only 4Mb of RAM and very mundane graphics, it throws up a series of questions for the user to answer: How many years have the couple been married? What is the husband’s salary? What is the value of the wife’s investments? How many children are of primary school age?
Over a 15-minute period, it builds up a mental picture of the relationship, weighting answers or asking more detailed questions where required. When the program has enough information, it makes a determination on the likely outcome of an Australian Family Law Court case, dividing assets between the two partners.
How does it do this? Partly through analysis and partly through precedent. The software was actually developed inside a neural network, an artificial intelligence technique that simulates, inside a powerful computer, a web of simple, independent “brain cells” or neurons.
Each neuron reacts to inputs, and delivers outputs, in a focused, task-specific way. When they are networked together and exposed to hypothetical problems, a remarkable thing happens: they learn from experience and appear to mimic human thought.
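A single such neuron can be sketched in a few lines of Python. The facts, weights and sigmoid “squashing” function below are a generic textbook illustration, not anything taken from “Split Up” itself:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of its inputs,
    squashed through a sigmoid to an output between 0 and 1."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Illustrative only: three normalized case facts feed one neuron.
facts = [0.8, 0.3, 0.5]      # e.g. marriage length, income gap, children
weights = [1.2, -0.7, 0.9]   # the learned importance of each fact
print(round(neuron(facts, weights, bias=0.1), 3))
```

Networked together, hundreds of these units pass their outputs to one another, and it is the weights – adjusted by experience – that encode what the network has learned.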
As with a human being, the neural net had to be taught the basics. Inside a Unix shell at La Trobe’s mainframe computer, the “Split Up” neural network went through a kind of Family Law 101: the statutes and rules that govern the Family Law Act in Australia were input and became the core of the program.
The neural net was introduced to 400 Family Law Court cases of the past three years. All the financial and personal information tabled at the trials was entered: age, kids, investments, assets, etc. Then the judgments were entered. It was asked to study the judgments, and go back over each case to learn how the rulings were arrived at.
Then Stranieri and Zeleznikow took a different and unusual tack: they asked the neural network – without a human agent – to assign different weights (or levels of importance) to particular factors in a marriage.
“We instructed the machine to learn the weights that those judges have given to each factor,” Stranieri told The American Reporter. “It learned how the human judges combined that data to produce a decision.”
In a sense, the neural net was asked to make decisions about what elements were more important in a case, and what factors deserved to have prominence over others. It created an inter-relational “tree” of facts and situations, and assigned numerical weights to each.
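The weight-learning step resembles standard supervised training: nudge the weights until the machine’s answers agree with the judges’ recorded rulings. The following miniature – with invented cases, factor meanings and a simple linear model standing in for the real network – shows the general idea:

```python
# A miniature of the weight-learning step: fit weights so that a simple
# linear model reproduces the asset shares awarded in past judgments.
# The cases and factor meanings below are invented for illustration.
cases = [
    # (years married, homemaker contribution, children), wife's share
    ([0.9, 0.8, 1.0], 0.65),
    ([0.2, 0.5, 0.0], 0.45),
    ([0.6, 0.7, 1.0], 0.60),
]

weights = [0.0, 0.0, 0.0]
bias = 0.5
rate = 0.1

for _ in range(5000):                    # repeated passes over the cases
    for facts, awarded in cases:
        predicted = sum(f * w for f, w in zip(facts, weights)) + bias
        error = predicted - awarded      # distance from the judge's ruling
        for i, f in enumerate(facts):    # nudge each weight to shrink it
            weights[i] -= rate * error * f
        bias -= rate * error

print([round(w, 2) for w in weights])
```

After enough passes the model reproduces each recorded judgment, and the learned weights can be read as the relative importance the judges gave each factor – the machine analogue of what Stranieri describes.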
The results were intriguing: when the neural net was asked to rule on the facts of a case without knowing the real-life outcome, it was uncannily accurate. But there was one problem: it could not explain itself. While the decisions matched the real cases, the neural net had to be taught how to explain the reasoning behind its decisions.
Two lawyers – Renata Alexander of the Legal Aid Commission in Victoria state, and family law barrister Dr. Richard Ingleby – were then brought in to help the neural net explain, in legal terms, how it had arrived at its decisions. Their explanations were added into the system, and became part of the inter-relational database used by the network. The next innovation was to reduce and convert the operation of the neural network into something that could be run on the average IBM-compatible laptop or desktop.
“We had trained the neural networks on Unix, then stored every possible combination of the situations into the laptop,” said Stranieri. “When we’re actually running a case, we don’t need to run the network – which would slow the laptop right down. It just refers to the database.”
That was no mean feat: at any time, between 21 and 30 neural networks – each with between 10 and 32 logic inputs and outputs – were juggling the facts of a case. The researchers had to convert this tree of inter-relational complexity into a database that could be quickly and easily accessed by the average computer.
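The conversion Stranieri describes amounts to running the slow, trained network once over every combination of coarsened inputs on the big machine, then shipping only the resulting table of answers to the laptop. A hypothetical sketch, with an invented formula standing in for the trained network:

```python
from itertools import product

# Sketch of the conversion step: evaluate the trained model once for every
# combination of discretized inputs, then keep only the lookup table.
# trained_model is an invented stand-in, not the real "Split Up" network.
def trained_model(years_band, income_band, children_band):
    return round(0.4 + 0.05 * years_band - 0.02 * income_band
                 + 0.03 * children_band, 2)

BANDS = range(4)  # each factor reduced to a few discrete bands

lookup = {combo: trained_model(*combo)
          for combo in product(BANDS, BANDS, BANDS)}

# On the laptop, deciding a case is a dictionary access, not a network run.
print(lookup[(3, 1, 2)])
```

The trade-off is classic precomputation: the table grows with every extra input and band, but answering a case becomes a near-instant lookup even on a modest 386.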
The reaction has been surprising, ranging from the hostile to the enthusiastic.
“Some people who have looked at the system – who are either going through divorce themselves or were just interested in trying it out – they’ve said that they might be more willing to accept the advice from a machine than from a lawyer,” said Stranieri. “Because they can be sure that the machine is not biased.”
Some of the resistance was driven by the unjustified fear that artificial intelligence might replace lawyers and judges, but Zeleznikow sees the system as a potential boon to lawyers.
“They can trial-run a case, and in the reasoning get references to the precedents that were used to arrive at the decision,” he said. But he agreed that it might take some of the business away from lawyers, especially those prone to advising clients to proceed with costly divorce litigation when the chance of winning is low.
“When you go to see a lawyer in family law, he is going to encourage you to fight for what you can most get,” said Zeleznikow. In a simulation of a fictitious couple which the researchers ran for this reporter, “Split Up” gave 45 per cent of the assets of the marriage to the husband. “The lawyer might tell a client, let’s try for 60 per cent of the assets because you contributed more,” Zeleznikow said. “There is a small possibility – if a judge interprets things just the way you want it – that you might get 60 per cent.
“What he doesn’t tell the client is that if it does go to court, there’s also a possibility he might get 30 per cent. This, on the other hand, is going to be quick, with no emotional pain, and cheap.”
“This might encourage more people to settle out of court, because they can get additional and unbiased advice about what might happen if they did go to court,” said Stranieri. “At the moment, that advice would be coming from their lawyer, who is an interested party.”
The researchers are now looking for commercial partners to refine the program and take it to market. In the meantime, they will be exposing the neural net to more cases, so it can build up its understanding of family law and what influences decisions.
“It’s very difficult to get the system exposed to as many cases as a real-life judge does,” said Stranieri. “The other problem is that a human exercising judgment always takes into account the facts of a particular case in a very human way. Sometimes a brand-new fact emerges, which the system has never been exposed to, and it might not know how to react, whereas a human does.”
On the other hand, “Split Up” would never be influenced by personal biases or bigotry.
“Judges may also make decisions on factors that are not necessarily relevant to the case, such as the appearance of the litigants,” said Zeleznikow. While it might be just and proper that such factors be kept out of the program, he said, “you should also be aware that it does occur in real life.”