WASHINGTON — The Senate overwhelmingly passed legislation Tuesday designed to protect children from dangerous online content, pushing forward with the first major effort by Congress in decades to hold tech companies more accountable for the harm that they cause.

The bill, which passed 91-3, has been pushed by parents of children who died by suicide after online bullying or who were otherwise harmed by online content. It would force companies to take steps to prevent harm on online platforms frequently used by minors, requiring them to exercise a “duty of care” and ensure they generally default to the safest settings.

The House has not yet acted on the bill, but Speaker Mike Johnson, R-La., has said he is “committed to working to find consensus.” Supporters are hoping that the strong Senate vote will push the House to act before the end of the session in January.

The bill is about allowing children, teens and parents “to take back control of their lives online,” said Sen. Richard Blumenthal, D-Conn., who wrote the bill with Sen. Marsha Blackburn, R-Tenn. He said the message to Big Tech is that “we no longer trust you to make decisions for us.”

The bill would be the first major tech regulation package to move in years, and it could pave the way for other bills that would strengthen online privacy laws or set parameters for the growing use of artificial intelligence, among other measures. While there has long been bipartisan support for the idea that the biggest technology companies should face more government scrutiny, there has been little consensus on how it should be done. Congress passed legislation earlier this year that would force TikTok’s China-based parent company to sell the platform or face a ban, but that law targets only one company.

“This is a good first step, but we have more to go,” said Senate Majority Leader Chuck Schumer, D-N.Y.

If the child safety bill becomes law, companies would be required to mitigate harm to children, including bullying and violence, the promotion of suicide, eating disorders, substance abuse and sexual exploitation, as well as advertisements for products that are illegal for minors, such as narcotics, tobacco or alcohol.

To do that, social media platforms would have to provide minors with options to protect their information, disable addictive product features and opt out of personalized algorithmic recommendations. They would also be required to restrict other users’ ability to communicate with children and to limit features that “increase, sustain, or extend the use” of the platform — such as autoplay for videos or platform rewards.

The idea, Blumenthal and Blackburn say, is for the platforms to be “safe by design.”

“The message we are sending to big tech is that kids are not your product,” Blackburn said at a news conference as the Senate passed the bill. “Kids are not your profit source. And we are going to protect them in the virtual space.”

Some tech companies, including Microsoft, X and Snap, support the bill. Meta, which owns Facebook and Instagram, has not taken a position.