February 14, 2022
Commissioner, Federal Trade Commission
Senior Vice President, C_TEC, U.S. Chamber of Commerce
Technology continues to be one of the greatest drivers of economic growth in the U.S., with advancements opening doors to new opportunities and lifting communities across the country. However, technological development also brings major challenges and risks.
The U.S. Chamber Technology Engagement Center (C_TEC) recently sat down with Christine Wilson, Commissioner at the Federal Trade Commission, to discuss issues on the FTC's agenda including data privacy, artificial intelligence, and ongoing process changes.
Consumers Should Know How Their Data Is Collected and Used
The question of how much consumers should know about the way businesses collect and use their data is one of the central controversies of the digital age.
“Consumers have very little understanding of how their data is collected, what kinds of data are collected, and how that data … is used, shared, and monetized,” said Wilson. “I believe these significant information asymmetries create a market failure.”
From the consumer's perspective, weighing the costs and benefits of different products and services requires understanding what data businesses are collecting and how they are using it.
“From a business perspective, I understand that complying with myriad and potentially conflicting laws is at best costly and, at worst, impossible,” said Wilson. “To compound the problem, there are gaps in existing sectoral laws that have created inconsistent privacy protections, and those gaps continue to widen.”
For example, in the health care sector, when a medical professional collects a patient's information, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) protects that information. For health information collected through websites, apps, and wearables, however, HIPAA does not apply.
According to Wilson, the COVID-19 pandemic has only exacerbated this issue. And health care is just one area where gaps remain in existing legislation.
“I think it's important to note that the concern extends beyond consumer privacy,” she explained. “Certain uses of consumer data also have serious implications for civil liberties, including our protections under the First and Fourth Amendments. And these concerns also intensified during the pandemic.”
Preemption and Private Rights of Action Are Not One-Size-Fits-All Approaches
Wilson noted that preemption and private rights to action are not all-or-nothing propositions — and Congress has not approached them as such in the past. In fact, federal statutes that preempt an entire field of law are not common.
As for private rights of action, which give citizens the legal entitlement to enforce their rights under a statute, Wilson is wary.
“I've had enough experience representing clients to understand … that abusive class actions increase costs for businesses,” she said, adding they also provide little in the way of deterrent or changed business practices.
“Instead of arguing about whether we should have private rights of action or no private rights of action, I think it would be more constructive to focus on establishing a constructive remedial framework, considering the policy goals of the regulatory regime and how best to accomplish those goals and how best to create appropriate levels of deterrence,” she said. “There isn't one right way to handle private rights of action and preemption. I encourage Congress to consider the many possible permutations, to choose one, and to get legislation in action.”
Artificial Intelligence Offers Great Benefits and Risks
While artificial intelligence has been and continues to be beneficial to many businesses, it also poses serious risks.
“We know AI can shrink the time and cost of executing complex tasks,” said Wilson. “We know AI can help industries rethink how to integrate information, analyze data, and improve decision-making. That said, I also recognize that AI carries with it potential pitfalls.”
For example, she noted, facial recognition technology might infringe on First and Fourth Amendment rights, especially if used without warrants against American citizens.
“These and other issues [have] led leading technology companies to announce a moratorium on the development or sale of this technology until there's a federal standard,” said Wilson. “The specters of mass surveillance and racial profiling and abuses of basic human rights and freedoms outweighed … the benefits for many companies that elected to suspend research and development in this space.”