Meta has filed a lawsuit against a firm it accuses of creating tens of thousands of fake Facebook profiles in order to harvest user data and offer surveillance services to its customers.
According to The Verge, Voyager Labs positions itself as the global leader in sophisticated AI-based investigative solutions. In reality, this entails sifting through massive amounts of social media content to draw conclusions about specific users.
The Guardian noted in 2021 that Voyager Labs had offered its services to the Los Angeles Police Department (LAPD) on the promise that it could identify people who would go on to commit crimes in the near future.
Filed Lawsuit
The lawsuit was disclosed in a blog post published by Meta on Thursday, Jan. 12, in which the company claimed that Voyager Labs had breached its terms of service.
In the complaint, dated Nov. 11, 2022, Meta alleges that Voyager Labs improperly used its surveillance software to harvest data from Facebook and Instagram through more than 38,000 fake accounts it created. According to the complaint, Voyager Labs also gathered information from other social media and messaging services, including Twitter, YouTube, and Telegram.
According to Meta, Voyager Labs used these fake accounts to scrape data from more than 600,000 Facebook profiles between July and September 2022. Meta says that on or around Jan. 12 it disabled more than 60,000 Voyager Labs-related accounts and profiles across Facebook and Instagram.
To stop Voyager Labs from continuing to violate its terms of service, Meta is asking the court to bar the firm from using Facebook, Instagram, and other services associated with those platforms.
Meta also wants Voyager Labs to surrender its "ill-gotten profits in an amount to be proven at trial," arguing that the firm unjustly enriched itself at Meta's expense.
Bold Claims
Many companies, notably Palantir and its rival Voyager Labs, make bold claims about their ability to predict criminal conduct from an individual's past actions and digital footprint.
Experts say the algorithms behind these technologies are too simplistic to reliably predict criminal behavior. A 2019 internal review of one of the LAPD's data-driven programs found the technology to be unreliable and racially biased.
According to Jessica Romero, Meta's head of platform enforcement and litigation, "Companies like Voyager are part of an industry that provides scraping services to anyone regardless of the users they target and for what purpose, including as a way to profile people for criminal behavior."
She added that this industry covertly collects information that people share with their community, family, and friends, without oversight or accountability, and in a way that may violate people's civil rights.