The U.S. Federal Trade Commission said it would follow the letter of the law when it announced its advance notice of proposed rulemaking on commercial surveillance and lax data security in August, meaning a robust stakeholder consultation was forthcoming. That process began in earnest with an exchange of views among affected parties at the FTC’s virtual public forum Sept. 8.

The official comment period for the ANPRM and the 95 questions it poses does not close until Oct. 21, but the forum served as a stepping stone toward a formal regulatory proposal. In addition to giving the agency a platform to gather feedback, the forum featured panel discussions in which industry players and consumer advocates shed light on their expectations and hopes for potential rules.

FTC Chair Lina Khan stressed that anything put on the table during the forum would be considered as the commission “determines how to proceed both in determining whether to proceed with a proposed rule as well as the form a proposed rule might take.”

“We know that today’s digital tools can offer enormous conveniences, but we also know that these tools and the business models behind them can also be used to track and monitor individuals in entirely new ways,” Khan added.

Industry Talks Best Practices and Clarifies Rules

FTC Senior Counsel Olivier Sylvain walked industry panelists through prompts drawn from the commission’s pressure points on how the rules would affect businesses across industries. The FTC asked whether companies have learned lessons or developed best practices related to consumer harm, and what kinds of rules would strike a balance between real consequences for companies and benefits for consumers.

In the conversation on hypothetical rules, Digital Content Next CEO Jason Kint suggested the FTC craft safeguards limiting the use or tracking of out-of-context data, saying it would be a step in the right direction.

“If you choose to visit a website…using this data in order to make product recommendations or target advertising to you, all of this is in the same context. The data is collected while you choose to use this application or website and used in that specific context,” Kint said. “It is not another party that you do not choose to interact with that collects this data or informs its algorithmic recommendations.”

Kint said the core of the problem lies with large companies that can use data out of its original context, noting how a search engine can use data for its advertising business as an example of non-consensual use and deception. He said everyone should “play by the same rules” but the rules need to be tougher for “companies that have a dominant position”.

Out-of-context data use is prevalent because it has become “too easy,” according to Mozilla security chief Marshall Erwin. It is one thing to grow a business and the services it provides based on the collection and use of data, but businesses stray from the principles of purpose limitation and proper safeguards knowing they will likely face no deterrent consequences beyond a one-time fine forcing them to make changes.

“The internet is basically kind of a no-consequences zone right now,” Erwin said. “A lot of what we do in browsers is crack down on certain behaviors, like cross-site tracking. Users don’t understand it, it’s opaque and it violates their privacy. Cracking down on that diminishes the benefits…making commercial surveillance practices less effective, and I think that’s what we’re doing. But on the other hand, it actually creates a real cost when there’s bad behavior. Financial penalties are a meaningful way to get the ball rolling.”

Erwin also brought data security into the best-practices dialogue. He said encryption in transit, multifactor authentication and password requirements are “basic things that everyone should do to reduce a lot of risk for many consumers,” and suggested regulators “take a closer look at a company” if those basics are not in place.

Consumers in crisis, necessary changes

While the FTC’s potential rules could benefit businesses by clarifying limits and fostering better customer relations, the forum’s consumer advocacy panel sought to justify those potential gains, outlining some of the most egregious impacts the growing data economy and its attendant bad practices have on individuals.

Electronic Privacy Information Center Deputy Director Caitriona Fitzgerald explained that tracking of internet browsing and app activity is “most problematic” because “it is unavoidable and beyond what reasonable consumers can grasp or understand.” She noted how data collection in these contexts reveals a range of sensitive information or characteristics that is then passed on to “hundreds, if not millions” of companies for various secondary uses, all of which pose nonconsensual risks and harms.

The panel explored the discriminatory harms of commercial surveillance, including discriminatory advertising and ad auctions involving marginalized communities. Joint Center for Political and Economic Studies President Spencer Overton said ongoing digital harms “threaten to widen” existing disparities for minorities, while Upturn Executive Director Harlan Yu suggested readily available data practices, such as one-click background checks and similar tools that rate people based on discriminatory traits, are becoming more common problems.

Consumer consent also remains a difficult area for individuals. The industry panel highlighted the ups and downs of Global Privacy Control, noting it could ultimately be a useful tool once fully understood by businesses and consumers. Regardless of GPC adoption, advocates do not want companies relying solely on user choice.

“Individual choice is relevant, but shouldn’t be the only determining factor in these kinds of balancing tests when we’re talking about fairness,” said Stacey Gray, CIPP/US, senior director for U.S. policy at the Future of Privacy Forum, noting that practicality is the key reasoning behind this position. She cited California’s opt-out privacy regime and the burden of making opt-out requests to hundreds of companies as examples of how consent becomes “meaningless if not harmful.”

Gray also asked the FTC to keep an open mind about consent issues regarding emerging technologies.

“We are moving more and more into a world involving voice-enabled devices, smart city devices and connected vehicles. In all these contexts, you are faced with a situation where it is not possible to obtain (the consent)…or where it’s just a bad idea to give individual choice,” Gray said. “We need other safeguards and to move away from the notice-and-choice framework. We have to talk about data minimization, pseudonymization, anonymization. All of these (are) ways of mitigating risk without placing a burden on the individual.”