vAIsual launches Dataset Shop for legally clean face biometrics training data
vAIsual is commercializing its ethical training data for face biometrics algorithms with the launch of its Dataset Shop.
From the Dataset Shop, companies can purchase legally clean datasets that serve as raw ingredients for synthetically generating content to train AI.
According to company CEO Michael Osterrieder, the project effectively solves the problems created when countless photographs are scraped illegally from the internet to train AI models.
“Developers of AI now have no excuse to steal images from the internet to train their algorithms. It’s time for the industry to move from the wild west and start acting in a legitimate and professional way,” Osterrieder says.
“We don’t just pay lip service to AI ethics. It is core to who we are as individuals and as a company,” he says. “Ensuring that we adhere to every single law regarding human biometrics is of great importance to us.”
Dataset Shop’s release comes months after vAIsual claimed to have assembled a “legally clean” 500,000-photo dataset. Augmentation, Osterrieder says, could boost that total to 2 million.
vAIsual received the Visual1st Conference's 2021 Innovation Award and a commendation from Anna Dickson, visual lead for Google's Images and Search divisions.
“By generating their own datasets, [companies] can acquire biometric releases and allow for self-identification, creating a unique and diverse dataset that can be leveraged to not just create synthetic media but to also generate algorithms to understand meaning and context of visual data,” Dickson said at the time.
The unveiling of the Dataset Shop comes amid a marked increase in the use of synthetic data to train biometric algorithms.
For instance, D-ID announced weeks ago that it was launching a video creation platform for producing synthetic media in the form of personalized videos, featuring what the company described as hyper-real AI presenters.
“Only time will tell how the law intervenes in this nascent industry that is growing exponentially,” Osterrieder admits.