Regulating Facebook merely nips at the edge of a bigger problem

Consumers are under surveillance in ways we have just begun to grapple with

Rana Foroohar

Quickly conceived conventional wisdom is a terrible side-effect of the age of high-speed media. Storylines develop rapidly, as news organisations chase the most clickable themes. Subtlety and nuance about complex issues are the casualties.

Take, for example, the rash of recent pieces about how, in the wake of Mark Zuckerberg’s five-hour tech support call with Congress, Washington is finally going to get serious about curbing the big tech companies known collectively as Faangs: Facebook, Amazon, Apple, Netflix and Google. “Regulation is coming,” scream the headlines. “The Faangs are finished.”

As the Financial Times’ San Francisco-based Richard Waters points out, the correction in some Faang stock prices will not stop these companies from growing. Meanwhile, most of the regulatory “fixes” being discussed merely nip at the edges of a massive problem.

It is good that US politicians are trying to make social media platforms disclose the sources of political advertising and take responsibility for sex trafficking online. But does anyone really believe that slapping a few moderate regulations — ones that other businesses already have to deal with — on Facebook is going to solve the economic, political and social problems being caused by Big Tech?

Europe’s General Data Protection Regulation goes much further on privacy, of course. But that is just the tip of the iceberg. The big tech companies exacerbate problems — from monopoly power to the need for a new tax and education system to declining faith in liberal democracy — that are not just technical. They are existential.

In an ideal world, the structural shift from a tangible to an intangible economy would trigger deep thinking about digital property rights, trade regulations, civil liberties and so on. Policymakers would have robust conversations with experts from a broad range of disciplines about what the new framework for growth in the digital economy should look like.

We do not live in that world. I worry that public anger engendered by the big tech companies, coupled with the desire of some politicians to score quick points, may well lead us into the sort of regulatory paradigm we saw in the financial sector post-2008.

Lobbyists and vested interests on both sides of the political aisle came up with a complex stew of new laws. Some were good, some bad. The sheer complexity created plenty of loopholes for corporate lawyers to slip through. While risk-taking was curbed at some individual institutions, the system as a whole is no safer. We lost sight of the only question that mattered: how can we create a financial sector that serves the real economy?

We need to ask the same question now about the digital sector. Whatever regulators choose to do to the Faangs themselves, the “data horse” has already bolted. Data have become the most valuable commodity in the world and, as such, companies of every stripe have joined the big tech groups in collecting them. Athletic brands are putting chips in our running shoes that can track where and how we jog. Goodyear embeds sensors in tyres that can beam information about drivers’ performance back to their fleet headquarters. Tractor company Deere & Co recently paid more than $300m for Blue River Technologies, a California start-up that uses deep machine learning to automate farm work. Algorithms stuffed with millions of pictures of cabbages can figure out which ones to spray with fertiliser and which to blast with herbicide.

Companies in every industry are counting on artificial intelligence to drive growth over the next several years. A McKinsey Global Institute report estimates that AI deployable now could create between $3.5tn and $5.8tn annually in value for companies, with the biggest potential gains in areas such as sales and supply chain management. Anecdotally, many companies I have spoken to say machine learning is increasing their return on investment by anywhere from 10 to 30 per cent.

But AI depends on data — the more of it you can stuff into the algorithms, the smarter they become. That means both people and products are under surveillance in ways we have only just begun to grapple with, and not just by the Faangs.

In Hawaii, the state tourism board has worked with online travel group Expedia to use facial recognition software to monitor travellers’ expressions. Users who opt in are monitored via computer webcams as they watch advertising for various kinds of vacations, and then given personalised offers.

In Europe, such tactics may require disclosure. GDPR rules stipulate that citizens have the right, for example, to an explanation for some decisions that are made by machines. That is a smart idea in principle, as algorithms are only as good as the people who are programming them. And yet, EU-style data laws may lead to a trade-off between privacy and economic competitiveness when data are the new oil.

We do not yet know the right balance. But two days of Congressional hearings starring Mr Zuckerberg, chief executive of Facebook, have not helped us to come up with the answers. We need to take the time to grapple with the seismic shifts being wreaked by the digital economy in a deep and real way.
