Recent advancements in artificial intelligence are helping organisations move faster and make more intelligent decisions. However, while organisations know AI can be good for business, they don't always know exactly how it works.
Eighty-two percent of enterprises are interested in using AI, but 60 percent worry about liability issues and 63 percent don't believe they have the proper in-house talent to manage it, IBM research shows.
IBM is releasing a new software service that will enable organisations not only to trust their AI systems to make decisions, but also to gain insight into how those decisions were reached and why. According to the company, the service will automatically detect bias, provide greater understanding, simplify management, and make AI more transparent.
IBM Watson general manager Beth Smith said that IBM led the industry in establishing Trust and Transparency principles for the development of new AI technologies.
“It’s time to translate principles into practice. We are giving new transparency and control to the businesses who use AI and face the most potential risk from any flawed decision making.”
The Trust and Transparency service runs on the IBM Cloud and works with models built in popular machine learning frameworks and AI environments such as Watson, TensorFlow, SparkML, SageMaker and Microsoft's AzureML. Users can also customise the service's software to better fit the specifics of their organisations.
IBM will also release its AI Fairness 360 toolkit to the open-source community; it includes nine algorithms, code and three tutorials available to data scientists, academics and researchers, with more tools and tutorials to be added in the future.
The service detects and helps remediate bias in data and model deployments, with the aim of ensuring fair AI outcomes, and explains those outcomes to business users in terms they understand.
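The article doesn't describe how IBM measures bias internally, but one widely used fairness metric in this space is the "disparate impact" ratio, which compares favourable-outcome rates across groups. The sketch below is purely illustrative (the function name, data, and threshold convention are assumptions, not IBM's implementation):

```python
# Illustrative sketch: the "disparate impact" ratio, a common bias metric.
# A value near 1.0 suggests similar favourable-outcome rates across groups;
# the conventional "80% rule" flags ratios below 0.8 as potentially biased.

def disparate_impact(outcomes, groups, favourable=1, privileged="A"):
    """Ratio of the favourable-outcome rate for the unprivileged group
    to that of the privileged group."""
    priv = [o for o, g in zip(outcomes, groups) if g == privileged]
    unpriv = [o for o, g in zip(outcomes, groups) if g != privileged]
    rate_priv = sum(o == favourable for o in priv) / len(priv)
    rate_unpriv = sum(o == favourable for o in unpriv) / len(unpriv)
    return rate_unpriv / rate_priv

# Hypothetical loan-approval outcomes (1 = approved) for two groups.
outcomes = [1, 1, 1, 0, 1, 1, 0, 1, 0, 0]
groups = ["A"] * 5 + ["B"] * 5
print(disparate_impact(outcomes, groups))  # 0.5: group B approved half as often
```

A monitoring service could compute a metric like this continuously over live predictions and raise an alert when the ratio drifts below a chosen threshold.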
Big Blue said that the ongoing health of AI in business applications can be checked with operational dashboards, alerts, and open data mart access for custom reporting.