
tjvnews.com

Saturday, January 17, 2026

Mark Zuckerberg’s ‘Meta’ Faces Shareholder Revolt Over Child Safety Concerns

Edited by: TJVNews.com

Meta, the tech giant helmed by Mark Zuckerberg, braced for a significant shareholder revolt at its annual meeting on Wednesday. The central issue is a push for greater transparency regarding Meta’s efforts to protect children online, a matter that has garnered increasing scrutiny amid rising legal and regulatory challenges.

A group led by Lisette Cooper, vice chair of Fiduciary Trust International (a subsidiary of Franklin Templeton), is spearheading the charge. According to a report in The New York Post, Cooper, who is also the parent of a child sex abuse survivor, is backing a non-binding resolution that urges Meta’s board to publish an annual report detailing the company’s performance in safeguarding children on its platforms. The Post added that the proposed report would require quantitative metrics to assess whether Meta has improved child safety globally and actually reduced harm to young users.

“If they want to reassure advertisers, parents, legislators, shareholders on whether they’re making a difference on dealing with this problem of harm to children, they need to have transparency,” Cooper stated in an interview with The Post. “They need better metrics.”

“Children are going to be the users of the future. If they have a bad experience on the platform, they are not going to keep coming back. This makes a huge difference to us as investors,” Cooper added.

This push for transparency comes at a tumultuous time for Meta, which faces legal challenges both in the United States and abroad. The Post report indicated that internationally, Meta is under scrutiny by the European Commission, which is investigating potential violations of the Digital Services Act (DSA), a new law that requires large tech firms such as Meta to actively police content on their platforms. European regulators are concerned that Facebook and Instagram may contribute to behavioral addictions in children and to so-called “rabbit-hole effects,” in which children remain engaged with harmful content for extended periods. If found in violation, Meta could face fines amounting to 6% of its annual revenue.

Meta’s board of directors has expressed opposition to the proposed resolution. In an April proxy statement, the board argued that the “requested report is unnecessary and would not provide additional benefit to our shareholders,” as per The Post report. This stance has drawn further criticism from child safety advocates and shareholders who argue that the lack of transparency undermines trust and accountability.

Cooper and her allies cite a growing body of litigation against Meta related to child safety issues. In October of last year, Meta was sued by dozens of states alleging that the company had “ignored the sweeping damage these platforms have caused to the mental and physical health” of young users, the report in The Post said. The lawsuits highlight issues such as poor sleep, disruption to schoolwork, anxiety, and depression, all allegedly exacerbated by Meta’s platforms.

One notable lawsuit, filed by New Mexico’s attorney general, alleges that Meta has exposed underage users to potential sex predators. This lawsuit is part of a broader wave of litigation that criticizes Meta’s handling of child safety.

In addition to these legal challenges, Meta is actively lobbying against two bills in New York designed to protect children online, as was revealed in The Post report. The bills aim to impose stricter regulations on tech companies to safeguard young users; Meta is reportedly trying to weaken or kill them through extensive lobbying efforts.

Two of the largest proxy advisory firms, Institutional Shareholder Services (ISS) and Glass Lewis & Co., have recommended that shareholders vote in favor of the resolution. Glass Lewis stated that the requested report would provide shareholders with valuable information, helping them understand Meta’s efforts to minimize harmful content, according to The Post report. ISS echoed this sentiment, noting that shareholders would benefit from additional information on how the company is managing risks related to child safety.

Despite the support from proxy advisory firms and a significant portion of the shareholder base, the resolution faces an uphill battle. The Post report indicated that Mark Zuckerberg controls 61% of Meta’s voting power through his ownership of super-voting Class B shares. This concentration of voting power effectively allows Zuckerberg to block any resolution he opposes.

However, Proxy Impact, which filed the resolution on Cooper’s behalf, highlighted in a filing that a similar proposal at last year’s annual meeting received nearly 54% support from shares not controlled by Meta management, The Post report said. This indicates a strong level of concern among independent shareholders about the company’s handling of child safety issues.

Michael Passoff, CEO of Proxy Impact, emphasized to The Post the fundamental business principle that “what gets measured gets managed.” This mantra underscores the need for Meta to collect and disclose data on its child safety initiatives. Speaking to The Post, Passoff criticized Meta for not providing accessible information on its efforts, stating, “This is like a basic first step for any business plan – get the data. What gets measured gets managed, and they’re not doing that. Or if they are, they just aren’t making it available to anyone.”

In response to growing concerns, Meta’s proxy filing outlines several steps the company has taken to enhance online child safety. These measures include the development of over 30 tools designed to support teens and families across Meta’s suite of apps. Additionally, Meta maintains policies that prohibit harmful content aimed at exploiting children.

“We want people, especially young people, to foster their online relationships in a safe, positive, and supportive environment, and we work closely with a broad range of stakeholders to inform our approach to safety,” the company stated, as was reported by The Post.
