The FTC referred a complaint against TikTok and parent company ByteDance to DOJ for potential children’s privacy violations, the agency announced Tuesday. The commission voted 3-0-2 to refer the complaint to the department; Republican Commissioners Andrew Ferguson and Melissa Holyoak, who joined the commission in March, were recused. The FTC’s investigation of the companies began during a compliance review associated with the agency’s 2019 settlement over Children’s Online Privacy Protection Act allegations against Musical.ly, TikTok’s predecessor, the commission said in a statement. The commission also investigated additional potential violations of COPPA and the FTC Act. “The investigation uncovered reason to believe named defendants are violating or are about to violate the law and that a proceeding is in the public interest, so the Commission has voted to refer a complaint to the DOJ, according to the procedures outlined in the FTC Act,” the commission said. The FTC typically doesn’t announce publicly that “it has referred a complaint,” but “we have determined that doing so here is in the public interest,” the agency said. The commission looks forward to collaborating with DOJ, it said. TikTok said in a statement Tuesday it has worked with the FTC for more than a year to “address its concerns” and is “disappointed the agency is pursuing litigation instead of continuing to work with us on a reasonable solution.” TikTok “strongly” disagrees with the allegations, which relate to “past events and practices that are factually inaccurate or have been addressed,” the company said.
Social media companies should display warnings about mental health risks associated with their platforms, U.S. Surgeon General Vivek Murthy said in a New York Times opinion piece Monday. Murthy has been examining the impacts of social media on youth mental health (see 2305230062). “A surgeon general’s warning label, which requires congressional action, would regularly remind parents and adolescents that social media has not been proved safe,” Murthy wrote. Cigarette packaging has carried warning labels since 1966, and Murthy argued similar warnings might reduce social media’s mental health harms. Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., issued a joint statement supporting the surgeon general’s announcement and urging Congress to pass their Kids Online Safety Act (see 2308040048).
The EU should focus on delivering concrete results for AI investment and coordinating with member states so regional startups can compete globally, European Court of Auditors member Mihails Kozlovs said Friday. The court held a hearing with EU policymakers, including European Parliament member Dragos Tudorache, a key negotiator for the EU AI Act (see 2405290007). The European Court of Auditors recently said the EU is lagging behind the U.S. and other global leaders on AI development (see 2405290038). It found “weaknesses in implementation and performance monitoring,” said Kozlovs. He noted a 2018 EU strategy called for the EU to become the “world-leading region for cutting-edge ethical and secure AI.” The U.S., the UK and China have recognized the “criticalness” of AI and outlined “ambitious” strategies, he said. Meanwhile, the EU hasn’t updated its AI investment targets since 2018, and it’s unclear how each member state will contribute, he said. “This pinpoints the need for increased focus on delivering results and better coordination with member states,” he said. “The ultimate goal should be to build an attractive and effective AI ecosystem in Europe, where AI startups could scale and grow to a competitive level globally.” Tudorache said AI is an “enabling technology” that will affect every sector of the economy. That the EU is the first “jurisdiction” in the world with comprehensive AI regulation is a “good thing” that provides a global model, he said. But the “hard work,” including implementation of technical standards, “starts now,” he noted: Funding AI workforce training will be the most important investment.
The FTC should finalize its privacy rulemaking before year's end, more than 30 organizations urged Chair Lina Khan in a letter Thursday. Signers included Fight for the Future, Demand Progress Education Fund, Center on Race and Digital Justice, Athena Coalition, Free Press, MediaJustice and Consumer Federation of America. Khan’s FTC first sought public comment on a potential rulemaking in August 2022 (see 2208110068). Since then, the “harmful impacts of unregulated surveillance and data collection have worsened,” the groups said, citing the rise of AI technology. They pointed to Amazon’s biometric surveillance using Ring technology and Meta’s tracking of users across the internet as examples. The groups said they’re “frustrated” with the agency’s “lack of action” since the initial announcement: “As core privacy rights are being challenged and data surveillance corporations are finding new ways to extract even more personal, sensitive data from individuals, we implore the FTC to put forth the NPRM on commercial surveillance.” The agency declined comment.
A California bill that would force tech platforms to pay for news content would undermine the independence of journalists by making them “dependent on government handouts,” NetChoice said Wednesday. Tech associations oppose the California Journalism Preservation Act (AB-886), a bill the Assembly approved but the Senate failed to pass in 2023. The bill was re-referred to the state Senate Judiciary Committee on Monday and is scheduled for a June 25 hearing. Assemblymember Buffy Wicks (D) introduced the legislation. Nothing in the bill ensures news organizations will use the proceeds to better fund journalism efforts, NetChoice Vice President Carl Szabo said. In addition, it would let the government decide who’s a journalist while forcing companies like Reddit and Pinterest to pay millions to major news companies, he said.
The federal government should rely less on Microsoft for information technology services given the company’s 2023 cyber breach, tech associations wrote to the Biden administration and Congress on Wednesday. Microsoft President Brad Smith is scheduled to testify Thursday before the House Homeland Security Committee in a hearing about the company’s July 2023 cyber breach (see 2404080054). The incident, which has been attributed to Chinese hackers, exposed 22 organizations and 500 consumers who do business with Microsoft. The Department of Homeland Security’s Cyber Safety Review Board in April described the attack as “preventable” and blamed Microsoft for a “cascade of errors” and a lack of investment in security standards. NetChoice, the Computer & Communications Industry Association, the Software & Information Industry Association, the Internet Infrastructure Coalition and the Coalition for Fair Software Licensing wrote the joint letter. “This over-reliance on single vendors is a growing concern, as many public sector organizations worldwide are using the same provider for everything from operating systems to security tools,” they wrote. The organizations recommended the government review past security performance more thoroughly in the procurement process and assess concentration risks associated with overreliance on one vendor.
Four tech industry groups on Tuesday joined in opposing a kids’ social media legislative proposal advancing in Pennsylvania, despite support from their member Google (see 2406060062). The Computer & Communications Industry Association, NetChoice, TechNet and Chamber of Progress oppose the Online Safety Protection Act (HB-1879). Pennsylvania’s House Children and Youth Committee voted 15-9 to pass the bill Tuesday, with one Republican in favor. The legislation would require that online platforms consider the “best interests of children” when developing products and features “children are likely to access.” Violators would face potential civil penalties enforced by the attorney general. CCIA and NetChoice have argued that similar measures passed in California, Maryland and Vermont are unconstitutional, given the free speech implications for children. Committee staff on Tuesday listed Google as a supporter and the four associations as opponents. Google previously declined comment on why it supports the measure, and the company didn’t comment Tuesday. Chair Donna Bullock (D), who wrote the bill, won passage of an amendment Tuesday with new language meant to address critics’ concerns that the bill’s description of what keeping children’s “best interests” in mind means was too “vague.” However, Rep. Charity Grimm Krupa (R) said the amendment fails to address concerns from Attorney General Michelle Henry (D) about enforceability. Krupa said she agrees with ranking member Barry Jozwiak (R), who previously said the bill is unenforceable due to its “overly broad” terms and definitions. The measure's intent is “good,” but sponsors haven’t addressed issues raised by Jozwiak, Henry and the industry groups, she said. Henry’s office didn’t comment Tuesday. Bullock said parents have an obligation to show children how to use social media platforms safely, but they can’t “do it alone.” Parents don’t understand every aspect of the technology and what’s “happening behind the scenes,” she said. Platforms should make these services “age-appropriate” and prioritize the safety of children over profits, she added.
Congress should create a federal commission to examine how law enforcement can better detect and prosecute AI-driven child abuse online, attorneys general from 41 states, the District of Columbia, Guam and the U.S. Virgin Islands wrote Monday. AGs from California, Mississippi, Colorado, Arkansas, New York, Utah and Maryland were among those supporting the Child Exploitation and Artificial Intelligence Expert Commission Act (HR-8005). Introduced by Rep. Nicholas Langworthy, R-N.Y., HR-8005 is co-sponsored by 16 House members, including seven Democrats. “We are hopeful the creation and work of this commission will result in appropriate safety measures and updates to existing laws, so we can protect children from being digitally exploited and hold criminals accountable,” the AGs wrote.
The House Commerce Committee’s bipartisan privacy bill doesn't properly preempt state law, CTA, TechNet, NetChoice, Computer & Communications Industry Association and a coalition of industry groups wrote Monday in a letter to Chair Cathy McMorris Rodgers, R-Wash., and ranking member Frank Pallone, D-N.J. The House Innovation Subcommittee advanced a draft version of the American Privacy Rights Act (APRA) to the full committee in May (see 2405230056). APRA “falls short of creating a uniform national standard due to its inadequate federal preemption of the ever-growing patchwork of state privacy laws,” they wrote. “Without full preemption of state laws, APRA will add to the privacy patchwork, create confusion for consumers, and hinder economic growth.” The group behind the letter, the United for Privacy Coalition, includes ACT | the App Association, Chamber of Progress, Engine, Interactive Advertising Bureau, Information Technology Industry Council, Software & Information Industry Association and the U.S. Chamber of Commerce. They urged the committee to pass a “single, uniform national privacy standard.”
The Cybersecurity and Infrastructure Security Agency should consider allowing state and local governments to voluntarily comply with new rules under a 2022 cyber incident reporting law, the National Association of Secretaries of State told CISA in comments due last week. CISA is finalizing rules for the Cyber Incident Reporting for Critical Infrastructure Act, with requirements for critical infrastructure owners and operators (see 2203160051). The agency posted comments received through Thursday. NASS membership includes top state election officials from 40 states and territories, including Alabama, California, Colorado, Florida and New York. Its comments note that state and local election officials share cyber information with CISA on a "well-functioning,” voluntary basis. Industry groups previously asked CISA to write narrow rules and avoid overly burdensome reporting requirements for companies (see 2211290071). NASS is “concerned” the proposed rules may “disincentivize” state and local officials from participating in their “well-functioning voluntary partnership.” It continued, “CISA should prioritize continuing to maintain this voluntary partnership over imposing requirements on SLTT [state, local, tribal and territorial] government entities.” The proposed rules are “overly broad and would strain the resources of SLTT government entities during a critical time for cyber incident response,” the group said. The incident reports would require hours of staff time, which is “challenging for state government entities and potentially impossible for many small local jurisdictions,” NASS said.