Senators grilled the head of Instagram on Wednesday about the app's effects on children and teens, airing frustrations and attempting to extract various commitments from the company to make the platform a safer space for its youngest users.
But over two and a half hours, Adam Mosseri skirted key questions from the Senate Subcommittee on Consumer Protection, Product Safety and Data Security. Instead, he defended Instagram and its parent company Meta — formerly Facebook — touting existing safeguards and some the company plans to roll out soon.
Mosseri is the highest-ranking company executive to address Congress since a whistleblower leaked internal Instagram research about how the app can exacerbate mental health harms for some teens.
While the hearing's tone may have appeared calm, senators did not hold back their disdain for Instagram and other Big Tech companies, accusing them of failing to move quickly enough to implement protections for kids and of using vulnerable teens as "cash cows" to collect billions of dollars in profits.
Below are a few key takeaways from the confrontation.
Sen. Richard Blumenthal, who chairs the Senate's consumer protection subcommittee, opened the hearing by saying social media companies, including Instagram, fan the flames of teen mental health struggles by creating "addictive" platforms that are powered by algorithms designed to "exploit and profit from children's insecurities and anxieties."
The problem is dire and it's become a bipartisan issue, he said, adding that lawmakers have grown impatient waiting for Big Tech companies to responsibly course-correct.
"Parents are asking, what is Congress doing to protect our kids and the resounding bipartisan message from this committee is that legislation is coming," Blumenthal told Mosseri, adding, "We can't rely on self-policing."
"Some of the big tech companies have said 'Trust us.' That seems to be what Instagram is saying in your testimony but self-policing depends on trust. The trust is gone."
Sen. Marsha Blackburn, R-Tenn., expressed similar sentiments, admitting she is "a bit frustrated."
"This is now the fourth time in the past two years that we have spoken with someone from Meta. The conversation continues to repeat itself ad nauseam.
"Tennesseans want Big Tech to be more transparent and to accept responsibility for your actions. And time and time again, you say things that make it sound like you are hearing us and agree – but then nothing changes," Blackburn said.
Blackburn, Blumenthal and other members of the committee were also critical of Instagram's announcement of new product updates on Wednesday at midnight PT, saying they are "too little too late."
Wednesday's hearing with Mosseri was one of several that legislators have held with leaders of giant tech companies in recent years. And while their frustrations seem universal, there does not appear to be a consensus around legislation and the extent of limits they'd like to impose.
In closing the hearing, Blumenthal offered an ominous-sounding glimpse of the type of action Instagram can expect from Congress, saying, "We're going to move forward with specifics."
"I think you'll sense on this committee pretty strong determination to do something well beyond what you have in mind," he added.
Mosseri, a longtime executive at Facebook who was named head of Instagram in 2018, held to the message that Instagram plays a positive role in the lives of teenagers and that Meta is going to great lengths to be transparent with the public. He objected to Blumenthal's characterization of Instagram as addictive and said the research revealed by the whistleblower has been misinterpreted. He also urged the committee to take a closer look at Instagram rivals TikTok and YouTube, saying more teens are moving toward those platforms.
"I recognize that many in this room have deep reservations about our company," Mosseri told the committee. "But I want to assure you that we do have the same goal. We all want teens to be safe online."
Mosseri laid out a proposal to establish an industry body that would work together to determine best practices for verifying age, ensuring age-appropriate experiences, and creating more effective parental controls.
"The body should receive input from civil society, from parents, and from regulators," he said in prepared testimony.
He did not respond to an accusation from Blumenthal that the company's midnight unveiling of new safety features was nothing more than a PR stunt ahead of the hearing. However, he did tout the new tools as much-needed solutions.
The new features include a "take a break" function that will nudge users to log off after a certain period of time and place limits on both unwanted interactions with adults and exposure to sensitive content. In the spring, the platform will add the option of parental oversight of children's accounts.
Mosseri avoided questions about sweeping changes to Instagram for users under 18 by stressing the importance of parental autonomy in making social media decisions that best suit their children.
When asked if all ads should be banned for users under 18, he responded that Meta adheres to strict rules regarding advertising to minors. When asked if Instagram might consider no longer suggesting followers for young users because they can lead to harmful content, he suggested that doing so would interfere with users' ability to pursue their interests.
When asked if the U.S. should implement the same child protection standards as the U.K., Mosseri said he believes in child protection standards but did not give the yes-or-no answer he was asked for.
When asked if Instagram would agree to share its data and research findings with the committee, he extolled the virtues of transparency but said there are a number of complicating factors when retrieving the information. Ultimately, he said, lawmakers need to ask for specific data sets in order for the information to be retrieved.
Late in the hearing, a frustrated Blumenthal said, "Your answers are so vague as to make it impossible for us to make any decisions."
Throughout the proceedings, several senators revealed that they created fake Instagram accounts to test how effectively the company is adhering to its own guidelines and regulations.
The results were grim.
In many instances the fake users were quickly hit with disturbing content glorifying anorexia and other eating disorders as well as content promoting body dysmorphia.
Blumenthal disclosed that on Monday his office repeated an experiment it first tried two months ago: staffers created an account for a fictional teen and began following accounts promoting eating-disorder content.
"Within an hour all of our recommendations promoted pro-anorexia and eating disorder content," Blumenthal said. "Nothing has changed. It's all still happening."
Sen. Mike Lee, R-Utah, said his office created an account for a 13-year-old girl. Shortly afterward, the algorithm recommended a famous female celebrity to follow, and when they did, Lee said, "It went dark fast."
The fake account was flooded with content about diets, plastic surgery and other damaging material for an adolescent girl, he said.
In another example this week, Blackburn's staff exposed a flaw in Instagram's settings for teens under 16.
According to Instagram's policies, new teenage accounts should automatically default to a private setting. But when Blackburn's team set up a phony account for a 15-year-old girl, it automatically defaulted to public.
Mosseri acknowledged the error, explaining the mistaken default setting was triggered because the account was created on a web browser, as opposed to a mobile app.
"We will correct that," he said.
Senators were keen to extract information about the future of Instagram Kids, a project Mosseri has put on hold.
Several committee members, including Blumenthal, used their time to ask about the proposed expansion of the platform to children between the ages of 10 and 12.
The project was blasted by lawmakers and parents following revelations about the app's toll on young people's mental health, especially on girls, who reported feeling worse about themselves after spending time on the app.
But on Wednesday, Mosseri would not commit to permanently ending the program, saying the company believes it could help parents better monitor their children's online activity.
He noted studies showing that an increasing number of children get a smartphone by age 11. By establishing a kids' version of the app, he told the committee, the company can improve age verification measures and ensure that no child under 13 is on the platform without a parent's consent.