Ensuring Fairness with Machine Learning and Automation
Analytical advances offer new possibilities for collections: automating operational decisions as well as informing strategy. However, the industry needs to ensure customers are treated fairly and the vulnerable are catered for in this new world.
In collections, as in any other sector adopting analytics, automation is driving wider use of predictive analytics, bringing greater consistency and eliminating non-value-adding ‘data wrangling’ effort. The push towards ‘prescriptive analytics’ is now percolating down from strategic decision-making to the operational level. A future is in sight where day-to-day decisions can be fully automated and continuously optimised: systems that automatically tune the actions taken on a per-customer basis.
Systems will be able to optimise call times and frequencies, plus make decisions on outsourcing and the triggering of legal action. Using automation to decide who to call and collect from should substantially increase productivity. Of course, analytics – and predictive modelling in particular – is not without its dangers. Model bias and fairness are important concerns.
Our view is that preventing model bias due to past behaviour – such as making predictions that favour particular characteristics just because this was common practice in the past – is a technical issue and should be solved by technology. Systems should be able to consider all options for accounts, adapting to business rules and regulatory constraints. Experimentation will be necessary to ensure there is the data available to explore possibilities previously discounted.
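The experimentation described above is, in essence, an explore/exploit trade-off: the system must occasionally try options it would otherwise discount, so that data exists to evaluate them. A minimal epsilon-greedy sketch, assuming a hypothetical set of treatment actions, a scoring model and a business-rules filter (all names here are illustrative, not part of any specific product):

```python
import random

# Hypothetical treatment options for an account; names are illustrative.
ACTIONS = ["call_morning", "call_evening", "sms_reminder", "offer_grace_period"]

def allowed_actions(account, business_rules):
    """Keep only the options permitted by business rules and
    regulatory constraints for this account."""
    return [a for a in ACTIONS if business_rules(account, a)]

def choose_action(account, model_score, business_rules, epsilon=0.1):
    """Epsilon-greedy selection: usually take the model's best-scoring
    action, but with probability `epsilon` explore another permitted
    option so the data needed to evaluate it keeps being collected."""
    options = allowed_actions(account, business_rules)
    if random.random() < epsilon:
        return random.choice(options)  # explore a discounted option
    return max(options, key=lambda a: model_score(account, a))  # exploit
```

With `epsilon=0` this degenerates to pure model-driven automation; raising it trades a little short-term performance for the data needed to revisit past assumptions.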
However, ensuring fairness is a more general, societal issue that will need co-operation between technology, business and regulation. Since the financial crisis there has been a great deal of attention given to the journey customers take through collections processes and the treatment of vulnerable people. Systems need to be able to take account of vulnerabilities and special circumstances to avoid poor treatment.
As the world comes to terms with, and increasingly adopts, machine learning, it is also becoming more aware of the technology’s limitations and dangers. This awareness is driving efforts to circumscribe its use: predictive model fairness, usage tracking and handing some control to data subjects are now serious issues that must be addressed to meet regulatory requirements without losing the benefits machine learning offers.
Analytics systems need transparent, automated ways to deal with such constraints. The collections industry is well-versed in dealing with regulation and treating customers fairly and will need to bring these skills to the fore. Machine learning and related analytics techniques can help in this endeavour. They can be proactive in identifying innovative ways to deal with vulnerable customers as well as reactive – proscribing actions that treat such customers unfairly. Regulation is obviously focusing on the latter, but the benefit to society from the former is just as important.
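A reactive check of the kind described above can be made transparent and automated. A minimal sketch of a demographic-parity style monitor over decision logs; the grouping, metric and tolerance here are illustrative assumptions, not a regulatory standard:

```python
from collections import defaultdict

def action_rates(decisions):
    """decisions: iterable of (group, adverse_action_taken: bool).
    Returns the rate at which an adverse action is taken per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [taken, total]
    for group, taken in decisions:
        counts[group][0] += int(taken)
        counts[group][1] += 1
    return {g: taken / total for g, (taken, total) in counts.items()}

def flag_disparity(decisions, tolerance=0.1):
    """Flag when the adverse-action rate differs between groups by
    more than `tolerance` -- a simple demographic-parity style check
    that can block or escalate an automated decision for review."""
    rates = action_rates(decisions)
    return max(rates.values()) - min(rates.values()) > tolerance
```

A flagged disparity need not mean wrongdoing, but it gives compliance teams an automated, auditable trigger for human review.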
Automation can be used to nudge customers out of arrears early in the default cycle, as well as to offer grace periods or debt reduction where appropriate. Treating customers fairly increases the likelihood of full repayment, with the added benefit of more rehabilitated, loyal and profitable future customers.
Advanced technology reduces compliance and operational risks by eliminating the errors that typically arise from fragmented data and manual processes. Organisations need to ensure they keep pace in this rapidly changing world.
At QUALCO, we keep these concerns foremost in mind when designing our technology and products.