Machine learning algorithms may transform computing, but they remain something of a black box. Still, there are ways to tame them with flexible data governance, according to tech startup exec Andrew Burt.
As a lawyer on the staff of the FBI Cyber Division, Andrew Burt spent a good deal of time at the intersection of national security and technology. That meant shaping policy in an organization charged with handling massive amounts of sensitive data. Now, as chief privacy officer and legal engineer at startup Immuta Inc., he is part of a new cadre working to bring stronger data governance to machine learning, the artificial intelligence technology that is moving from laboratories into mainstream computing.
Machine learning algorithms pose a particular challenge for governance because the technology does not necessarily disclose how it reached its decisions. To cast some light on this black box and what it means for data governance, we recently connected with Burt to discuss sensitive data processing at scale.