New Delhi: Anthropic, an AI startup, has updated its policies to allow teens to use its AI platform. The company said that developers building apps for minors can use Anthropic's AI as long as they implement certain safety features and disclose to users that they are using Anthropic's technology.
Anthropic listed safety measures developers must adopt to protect younger users of these apps, including age verification, content moderation, and guidance on using AI responsibly. The company may also provide developers with additional tools to make their apps safer for children.
Developers using Anthropic's AI must also comply with child safety and privacy regulations such as COPPA. Anthropic will audit these apps regularly to ensure compliance and may suspend or remove apps that fall short.
Anthropic has expressed that AI has the potential to greatly benefit teens in understanding complex theories and concepts, making it a valuable tool in the education sector. This sentiment is echoed by other industry players such as X (formerly Twitter) and OpenAI, which are also exploring AI tools for teens and students.
While the move has generated significant hype, many remain worried about the negative effects of using AI in education, especially for teens and students.
UNESCO, the United Nations' education agency, has urged governments to set rules on how children use AI, calling for age limits and regulations to keep children's data safe. The organization believes AI can help children learn, but says it must be deployed responsibly to avoid harmful effects.