The ubiquitous topic of private data is one I imagine we will never fully understand. Artificial intelligence manipulates our data in ways so complex that the result could never be traced back to any one keystroke or mouse click. Besides, what can one person do to stop big tech and its insatiable appetite for information?
In steps one person in the form of California Gov. Gavin Newsom, who signed into law amendments to the California Consumer Privacy Act (CCPA), the most sweeping state data privacy regulations in the country. The law, which takes effect Jan. 1, regulates how data is collected, managed, shared and sold by companies and entities doing business with or compiling information about California residents. My guess is that Newsom couldn't care less about you, but the act will put a bump in the road to finding illegal aliens and those involved in sanctuary cities.
Among the major benefits to California consumers are the right to ask firms to disclose with whom they share their data and the right to opt out of having their data sold. For the data collectors, the downside is enormous. According to a 2019 Gartner survey of senior executives, the acceleration of privacy regulation and the related regulatory burden is the top emerging risk faced by companies globally.
The irony is that your data is needed to make artificial intelligence work for you, as with Siri and other assistants that remember your commands and take them to the next level. As such, companies and computer scientists are collaborating on computational and business solutions that strengthen data protections without hampering innovation and operational efficiency.
While large companies like Google and Amazon are currently paying lip service to the consumer, third-party apps on platforms like Instagram have gone largely unmonitored. Who is to say whether that data makes its way back to the major platform provider? Facebook said it has suspended tens of thousands of sketchy apps from 400 developers. Amazon is also cracking down on third-party apps for breaking its privacy rules, while Apple said it will no longer retain audio recordings that Siri collects by default, among other things.
Self-regulation by business is noble, but it is like the proverbial rooster set loose in the henhouse. Companies wink and nod to Congress and say all is well. It is somewhat unrealistic to think that liberal California will be able to outspend and outthink its adversaries in Silicon Valley in monitoring these companies. "Data privacy laws remain necessary because companies have to be prodded to adopt them," says Michael Kearns, founding director of the Warren Center for Network and Data Sciences. The maxim caveat emptor has never carried such weight. According to Elea Feit, senior fellow and Drexel University marketing professor, "Every time you interact with the company, you should expect that the company is recording that information and connecting it to you."
So what is the eventual solution to this data dilemma? Kearns envisions "a future in which regulators themselves start employing algorithmic tools. That's because when the companies you're trying to regulate are doing things with massive amounts of data and at a massive scale, and you're trying to spot misbehavior, you have to be ready to spot misbehavior at that speed and scale also." Until we find a workable balance between user privacy and companies' need for data, the problem will remain rooted deep in the way we act as a society.