Can AI help casinos reduce problem gambling?

Feldman had spent the past 30 years as an executive at MGM Resorts International, focusing on problem gambling and its financial, personal and professional repercussions. Before leaving the company, he helped create a national responsible gambling program designed to help gamblers change their behavior and reduce the risk of developing a gambling problem.

While on the ICE trade show floor, he noticed a few companies promoting new products that would use artificial intelligence not only to identify problem gambling, but also to predict it. Feldman was immediately skeptical. AI, he thought, could do a lot of things, but he had never heard of an application that could predict a mood.

AI as a solution to problem gambling “has raised many more questions than it has answered,” said Feldman, now a leading responsible gambling researcher at the International Gaming Institute at the University of Nevada, Las Vegas. “It was slick. It was interesting. It was intellectually compelling. But whether it was really going to do anything or not, a lot of that was still in question.”

Another question, this one obvious to any observer: isn’t a problem gambler exactly what a casino wants financially? In short: no. Even setting regulatory issues aside – gambling operators can be fined or lose their licenses if they fail to monitor problem gambling and act where necessary – it is, counterintuitively, not in their financial interest.

“Casinos need to have customers to sustain themselves,” Feldman said. “And the only way to have customers is to have customers who are healthy and successful themselves and able to pay their bills and come back the next time.” Problem gamblers “always end up the same way”, he added. “The end of the road is exactly the same for each of them: they have no money.”

More broadly, the pairing of AI and gambling makes intuitive sense: vast, constant streams of data; decision-making; computerized systems. With the explosion of online gambling, the possibilities for harnessing this combination for the public good seem endless. The reality – interpreting human behavior, navigating privacy laws, resolving regulatory issues – is far more complicated.

At the same time that Feldman was questioning these potential solutions, Danish researchers were trying to solve the same problems. Mindway AI, a company spun off from Aarhus University, does exactly what Feldman was skeptical about: it predicts future gambling problems. Built from research conducted at Aarhus University by its founder, Kim Mouridsen, the company uses psychologists to train AI algorithms to identify behaviors associated with problem gambling.

A significant challenge is that there is no single indicator of whether someone is a problem gambler, said Rasmus Kjærgaard, CEO of Mindway. And in most casinos, human detection of problem gambling focuses on just a few factors – primarily money spent and time played. Mindway’s system considers 14 different risk factors. These include money and time, but also canceled bank withdrawals, changes in the time of day the player gambles and erratic shifts in bet size. Each factor is given a score from 1 to 100, and the AI then works out a risk rating for each player, refining it with each hand of poker or spin of the roulette wheel. Players are rated on a spectrum from green (you’re doing fine) to blood red (step away from the game immediately).
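For illustration only, here is a minimal sketch of how per-factor scores like those described above could be combined into a single rating and mapped onto a color band. The factor names, weights and thresholds are hypothetical assumptions for the example; Mindway has not published its actual model.

```python
# Hypothetical sketch: combine per-factor risk scores (1-100) into one player
# rating and a color band. Factor names, weights and thresholds are
# illustrative assumptions, not Mindway's actual model.

FACTOR_WEIGHTS = {
    "money_spent": 0.20,
    "time_played": 0.15,
    "canceled_withdrawals": 0.25,
    "time_of_day_shift": 0.15,
    "erratic_bet_changes": 0.25,
}

def risk_rating(factor_scores: dict[str, float]) -> float:
    """Weighted average of per-factor scores, each on a 1-100 scale."""
    return sum(FACTOR_WEIGHTS[name] * factor_scores.get(name, 0.0)
               for name in FACTOR_WEIGHTS)

def color_band(rating: float) -> str:
    """Map a 1-100 rating onto a green-to-blood-red spectrum (cutoffs assumed)."""
    if rating < 40:
        return "green"
    if rating < 60:
        return "yellow"
    if rating < 80:
        return "red"
    return "blood red"

# Example: a player with frequent canceled withdrawals and erratic bet sizes.
scores = {
    "money_spent": 55,
    "time_played": 40,
    "canceled_withdrawals": 90,
    "time_of_day_shift": 70,
    "erratic_bet_changes": 85,
}
rating = risk_rating(scores)
print(round(rating, 1), color_band(rating))
```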

To adapt the algorithm to a new casino or online operator, Mindway sends the operator’s data to a group of experts and psychologists trained to identify such behaviors. (The company said they are independent, paid consultants.) They rate that operator’s players, and those ratings serve as a kind of benchmark. The algorithm then replicates their assessments across the full customer database.

“As soon as a player profile or player behavior changes from green to yellow, and through the other stages as well, we are able to do something about it,” Kjærgaard said. The value of the program is not simply that it identifies the blood-red compulsive gamblers; by monitoring jumps along Mindway’s color spectrum, it predicts and catches players as their behavior evolves. Currently, he said, online casinos and operators focus their attention on blood-red gamblers; with Mindway, they can identify players before they reach that point.
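To make the point about monitoring jumps along the color spectrum concrete, here is a small hypothetical sketch (not Mindway’s code) that watches a player’s band across sessions and flags each escalation, the moment an operator could step in before the player reaches blood red.

```python
# Hypothetical sketch: flag a player the moment their color band escalates,
# rather than waiting until they are already "blood red".

BAND_ORDER = ["green", "yellow", "red", "blood red"]

def escalations(band_history: list[str]) -> list[tuple[int, str, str]]:
    """Return (session_index, previous_band, new_band) for every upward jump."""
    jumps = []
    for i in range(1, len(band_history)):
        prev, curr = band_history[i - 1], band_history[i]
        if BAND_ORDER.index(curr) > BAND_ORDER.index(prev):
            jumps.append((i, prev, curr))
    return jumps

# Example: a player drifting from green to yellow to red over several sessions.
history = ["green", "green", "yellow", "yellow", "red"]
for session, prev, curr in escalations(history):
    print(f"Session {session}: {prev} -> {curr}; consider contacting the player.")
```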

The trickiest step, however, according to Brett Abarbanel, director of research at UNLV’s International Gaming Institute, is taking that data and explaining it to a player.

“If my algorithm flags someone as a potential problem gambler, I’m not going to send them a note and say, ‘Hey, good news: my algorithm has identified you as potentially a problem gambler. You should stop playing right away!’” The response would be obvious, Abarbanel said, raising a middle finger: “That’s what’s going to happen.”

How to actually communicate this information – and what to tell the player – is an ongoing debate. Some online gambling companies use pop-up messages; others use text messages or email. Kjærgaard hopes customers will take the data and, depending on the level of risk, contact the player directly by phone; the specificity of the data, he said, helps personalize those calls.

Since its start in 2018, Mindway has provided its services to seven Danish operators, two operators in Germany and the Netherlands, a global operator and a U.S. sports betting operator, Kjærgaard said. Online gambling giants Flutter Entertainment and Entain have also partnered with Mindway, according to the companies’ annual reports.

Since this technology is so new and there is no regulatory body setting a standard, Mindway and similar companies are, for now, essentially on their own. “We want to be able to tell you, or anyone else – the operators, obviously – that not only are we providing this scientific software, but we also want a third party to validate what we are doing,” Kjærgaard said. “But it’s a paradox that there are no specific requirements that I can ask my team to meet.”

Currently, Mindway’s technology resides primarily in online gambling. Operators attach Mindway’s GameScanner system to their portal, and it analyzes not only individual risks but also total system risks. Applying this level of oversight to in-person gaming is much more difficult.

An example of this kind of monitoring at scale can be found in Macau. Casino operators there use hidden cameras and facial recognition technology to track players’ betting behavior, as well as poker chips with radio-frequency identification tags and sensors on baccarat tables. The data is fed into a central database, where each player’s activity is tracked and monitored for collusion between players.

This, Kjærgaard said, is the future: financial incentives will drive adoption. “Smart tables” and efforts to combat money laundering and comply with financial regulations could eventually provide the data needed to bring AI-based oversight to in-person gambling.

(It also highlights another difficulty in applying AI to gambling: cultural differences. In Chinese casinos, Abarbanel said, players are used to this level of surveillance; that is not the case in the United States.)

AI would certainly work for casinos when it comes to marketing, promotions and game suggestions, Feldman said, but despite advances in recent years, he remains skeptical of its use in helping problem gamblers. Such a tool might be best applied at the personal level rather than broadly, he thinks, much like the “Your spending is 25% higher than last month” reminders that pop up in online bank accounts.

“It’s a bit like drinking. Do you know anyone who hasn’t gotten drunk once in their life? That doesn’t mean they’re alcoholics,” he said. “But maybe one drink a night kind of became one and a half, sometimes two, sometimes three – maybe you want to rein that in a bit. But you don’t want the bar keeping track of every drink, right?”

©2022 The New York Times Company

