Over the past decade, business models that rely on collecting and selling massive amounts of consumer data have proliferated, despite being misleading, damaging, and downright scary.

In a practice known as commercial surveillance, companies collect large amounts of data about people and use that data for purposes only tangentially related to their own products or services, such as profiting from the sale of that data or targeting advertising.

This can result in restrictions on opportunity and access, increased prices, invasions of privacy and increased surveillance by law enforcement, and these harms systematically and disproportionately affect Black and brown communities, as a Public Citizen report points out.

The bad news is that the United States lags far behind other countries when it comes to online privacy. The good news is that the Federal Trade Commission is finally on the case.

And it's about time. A June poll by Morning Consult found that more than 80% of voters in both parties want stronger data privacy protections. Years of polling on this issue have found similar, if not higher, levels of support.

In August, the FTC issued a Notice of Proposed Rulemaking exploring the possibility of regulating commercial surveillance. The deadline for public comment on the proposal is today.

In an ideal world, Congress would pass a national online privacy law to establish minimum security standards. But with two years of divided government on the horizon and bipartisan agreement on these issues elusive, the FTC's commercial surveillance rulemaking will likely be the main battleground for new federal online privacy safeguards.

The scope of FTC regulation is broad because the practice of commercial surveillance has become so ubiquitous. Here are three of the main issues the agency needs to address.

First, clicking “accept” on excessive fine print is not consent.

Many of today’s data privacy protections rely on long, vague, take-it-or-leave-it terms of service that users must agree to. It would take dozens of hours to read all the terms of service that a typical person is required to “agree to”.

It is unreasonable to expect ordinary people to read dozens of pages of fine print. Further, clicking an "accept" button is not meaningful consent to the wide range of abusive and intrusive practices that constitute the surveillance economy, as legal research from George Washington University shows.

Second, commercial surveillance algorithms infringe on civil rights.

Commercial surveillance data can be used to target advertisements to specific consumers, but under the current rules, it is not limited to this use. Predictive algorithms, formulas that attempt to anticipate the likelihood of certain social and behavioral outcomes, aggravate racial discrimination and bias, which can lead to serious economic, physical and social harm.

Among other things, algorithms can drive up prices, lower credit scores, make it harder to get loans, reduce educational opportunities, increase criminal penalties, worsen health outcomes, entrench employment disparities and lead to more intrusive surveillance by law enforcement.

And third, commercial surveillance harms competition, product quality and accessibility.

Monopolistic companies can collect and process troves of data themselves, which they then use to give their own products preferential treatment on their platforms. This not only excludes competitors' products, but also prevents people from buying better or cheaper ones.

This, in turn, prevents startups and entrepreneurs who invent better and cheaper products from profiting from their innovations. In some cases, it prevents innovation from happening at all.

Shutting out new businesses and new ideas is especially damaging to communities of color, which are excluded from wealth building by the anti-competitive behavior of Big Tech companies.

Furthermore, monopolistic firms tend to pay lower wages, reduce workers' bargaining power and exacerbate existing social inequalities.

Here’s what the FTC can do to fix these problems.

More importantly, the FTC can limit the data collected about people in the first place by outlining acceptable uses of the data and prohibiting those that are contrary to consumer interests.

Collecting less information would eliminate some of the problems with predictive algorithms and pave the way for new competitors that don’t have the advantage of a huge pool of data.

Additionally, greater transparency, periodic audits, and testing requirements would help mitigate consumer harm and begin to enforce civil rights principles on the Internet.

After the FTC considers the comments it receives on its rule proposal, the agency will likely develop and present a proposed rule, which will also be subject to a public comment period. Additionally, the agency will likely hold public hearings on any new rules it proposes.

Today's comment deadline marks the start of a process that will hopefully produce a strong national standard for online privacy, one that finally brings the United States in line with the rest of the world.

Emily Peterson-Cassin is the Digital Rights Advocate for Public Citizen. She wrote this column for The Dallas Morning News.
