AI regrets, Google has had a few … so it’s selling that experience to developers
Google executives reportedly have agreed to begin selling the company’s insights into developing AI products. Is there no more revenue to be wrung from search?
It also is true that Google takes one step only when it has a long-term plan to monetize the second. Google today reportedly sells software development tools that assess algorithm fairness.
In theory, the market for AI services in general is broad. It ranges from businesses and municipalities buying fairly simple facial recognition tools to multinationals and national governments building sophisticated systems to the growing supply web between them all.
But advising businesses on a squishy and fraught topic like AI ethics — assuming buyers and makers of systems really care about this new field of ethics — would seem to be the domain of consultants and B schools.
News that Google is walking down this path comes from an exclusive, if thin, article in Wired magazine. In it, Tracy Frey, Google’s director of cloud AI product strategy and operations, says the company has advised some clients already, and more have approached Google for help with AI ethics.
No start date or pricing for whatever is being developed is mentioned. Frey reportedly told Wired that the first effort could be showing customers what AI ethics problems might look like.
The story states that future efforts could be more traditional consulting services that would audit AI products for potential landmines.
For argument’s sake, a Google AIE (an acronym that does not in any way sound like an alarmed scream) program might grease the skids for some buy-curious businesses and offer others a legal fig leaf.
Not that one would necessarily look to Google for a sure hand on the AI policy tiller. In April 2019, the company assembled a panel of AI experts to review ethical issues. It was disassembled just two weeks later when Google employees objected that the panel's makeup was inappropriate.
Now, Google (and others) are pulling back on the AI development throttle. The unintended message seems to be, “We rubbed the lamp, and the djinn granted some amazing wishes. But we’re going to just cork that bottle until we can be sure each wish does not come with a scary catch.”