Messenger Kids
Developer: Meta Platforms
Operating system: Android, iOS
Platform: Communication
Language: English
Genre: Instant messaging, VoIP
License: Freeware
Messenger Kids is an instant messenger introduced by Meta Platforms (then Facebook, Inc.) in December 2017, intended as a safer alternative to the standard Messenger platform for a younger audience. It initially launched exclusively on iPad tablets running iOS in the United States; subsequent updates extended support to iPhone and Android devices and brought the app to additional markets such as Canada, Peru, and Mexico.[1][2]
Designed with a focus on child safety, the platform lets users register with their first and last names rather than phone numbers. Parental oversight is a key feature, encompassing identity verification and the approval of contacts. Messenger Kids is further distinguished by its lack of in-app purchases and advertisements, and by a commitment not to collect data for advertising purposes. Children's accounts are not searchable on Facebook, and the platform does not automatically convert a child's account into a full Facebook account when the child reaches the minimum registration age of 13. Notable features include augmented reality filters and lenses, as well as games and educational content.[3][4]
Despite its certification under the Children's Online Privacy Protection Act (COPPA), the platform has faced criticism. Concerns have been raised about the collection of message and photo contents from minors, with particular scrutiny directed at what critics saw as an effort to draw users into the Facebook experience from a very young age.[5][6] Public figures, including the UK's Health Secretary Jeremy Hunt, criticized the initiative, expressing reservations about Facebook's involvement with younger children. Criticism also came from child-development experts, including pediatricians, educators, and advocacy organizations, who argued that the app appeared to be creating a need rather than addressing an existing one. The Federal Trade Commission (FTC) highlighted Meta's misrepresentation of its parental controls, noting instances where children were able to communicate with unapproved contacts in group chats and video calls, contrary to the platform's stated principles.[7][8]