
Minimizing Automation Bias in Machine Learning

Microsoft's Diana Kelley Says Diversity Is Key Component for Resilient ML Models
Diana Kelley, cybersecurity field CTO, Microsoft

Developing robust and resilient machine learning models requires diversity in the teams working on the models as well as in the datasets used to train the models, says Diana Kelley of Microsoft.


“If you don’t understand the datasets that you are using properly, it’s a potential to automate bias,” she says.

In a video interview with Information Security Media Group at the recent RSA Conference 2019 in Singapore, Kelley discusses:

  • The APAC security landscape;
  • Automation bias in AI & ML;
  • The need for more diversity in ML teams and datasets.

Kelley is the cybersecurity field chief technology officer for Microsoft and a cybersecurity architect, executive adviser and author. She leverages her more than 25 years of cyber risk and security experience to provide advice and guidance to CSOs, CIOs and CISOs at some of the world’s largest companies. Previously, she was the global executive security adviser at IBM.

About the Author

Varun Haran

Managing Director, Asia & Middle East, ISMG

Haran has been a technology journalist in the Indian market for over six years, covering the enterprise technology segment and specializing in information security. He has driven multiple industry events, such as the India Computer Security Conferences (ICSC) and the first edition of the Ground Zero Summit 2013, during his stint at UBM. Prior to joining ISMG, Haran was a reporter with TechTarget, writing for SearchSecurity and SearchCIO, and later a correspondent with InformationWeek, where he covered enterprise technology topics for CIOs and IT practitioners.
