AI Is Telling Bedtime Stories to Your Kids Now

Ghost Utmost

write one, and let's see if I can compete

I've written a few.

But we can compare your "work" to the existing canon. There are plenty of screenplays written by humans, which everyone has access to, that we can compare it to. You'd know all about that.

Just put one page of the AI script up.
 

bnew


Your Child’s Next Playmate Could Be An AI Toy Powered By ChatGPT


[Illustration by Philip Smith for Forbes]

Jan 20, 2024, 06:30am EST



A host of startups are building robots and stuffed toys that can have full-fledged conversations with children, thanks to generative AI.

By Rashi Shrivastava, Forbes Staff



Six-year-old Sophia Valentina sits under a decorated Christmas tree as she unwraps her present: a tiny lavender-colored robot, whose face is a display and whose body is embedded with a speaker. “Hey Miko,” Sophia says, and the gadget lights up with round eyes and blue eyebrows.

In early December, Sara Galvan bought Miko Mini, a $99 robotic companion embedded with in-house AI models as well as OpenAI’s GPT-3.5 and GPT-4, with the hopes that it would help homeschool her daughters. Over the last month, Sophia has used Miko to solve math problems, listen to princess stories and ask questions like “how is Christmas celebrated,” Galvan said. “They begin to learn self-guided learning, which is huge for us with homeschool and helps expand their curiosity and their minds,” she said.

Miko, which can also play games like hide and seek, is part of a growing group of pricey GPT-powered robots rolling into the toy market. Some AI toys are touted as a screen-free form of entertainment that can engage children in conversations and playful learning, like Grok, a $99 AI plushie that can answer general questions (not to be confused with Elon Musk’s ChatGPT competitor Grok, though the toy Grok is voiced by his former girlfriend Grimes). Others claim to offer additional features beyond storytelling and learning activities. There’s Fawn, a $199 cuddly baby deer intended to provide emotional support, and Moxie, a $799 turquoise-colored robot that can recite affirmations and conduct mindfulness exercises. These robotic pals are designed to not only help children grow academically and improve communication skills but also teach them how to deal with their emotions during times of distress.

[Photo: Sneh Vaswani, cofounder and CEO at Miko. Courtesy of Miko]

Fostering social and emotional well-being is one of Miko’s intended functions, said CEO and cofounder Sneh Vaswani, who participated in several international robotics competitions before starting his company in 2015 and launching the first iteration of AI companion Miko in 2017. “Our goal is to help parents raise kids in the modern world by engaging, educating and entertaining children through multimodal interactions with robotics and AI,” he told Forbes.

Vaswani has sold almost 500,000 devices to date across more than 100 countries and expects to cross $50 million in revenue in the fiscal year ending in March 2024, he told Forbes. His Mumbai-based startup has raised more than $50 million and was last valued at about $290 million, according to Pitchbook.

[Photo: Miko 3. Courtesy of Miko]

Miko is trained on data curated from school curricula, books and content from partners like Oxford University Press, and is built using proprietary technology including facial and voice recognition, recommendation algorithms and a natural language processing layer, Vaswani said. The bot is programmed to detect different accents and to provide educational content catered to the geographic region where it’s sold. The company has also teamed up with media giants like Disney and Paramount, allowing them to publish their content on Miko.

“There could be a storytelling app from Disney or a Ninja Turtles app from Paramount,” he told Forbes, adding, “It’s like a Netflix plus an iPhone on wheels given to a child.”

Other toys were built out of a desire to bring fictional characters to life. Misha Sallee and Sam Eaton, the cofounders of startup Curio Interactive — and the creators of Grok — were inspired to create the rocket-shaped AI plushie thanks to fond childhood memories of watching movies like Toy Story. But making toys speak intelligently was a far-fetched idea until ChatGPT came out, Sallee said. Grok is built on a variety of large language models that help it act like a talkative playmate and an encyclopedia for children. Canadian musician Grimes invested in the startup and voiced the characters, which are a part of what Sallee calls a “character universe.”

“As a mother, it resonated with her. It was something that she wanted to lean in and collaborate on,” Sallee said. “She wanted a screen-free experience for her kids and for kids around the world.” (Grimes did not respond to a request for comment.)


“It’s like a Netflix plus an iPhone on wheels given to a child.”

Sneh Vaswani, CEO and cofounder of Miko

Another plush AI toy is Fawn, a baby deer programmed with OpenAI’s large language model GPT-3.5 Turbo and text-to-speech AI from synthetic speech startup ElevenLabs. Launched in July 2023 by husband and wife duo Peter Fitzpatrick and Robyn Campbell, Fawn was designed to help children learn about and process their emotions while maintaining the tone and personality of an eight-year-old. Still in its early stages, the startup plans to ship its first orders before the end of March 2024.
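
Forbes doesn't publish Fawn's code, but the setup it describes is a simple two-stage pipeline: a language model generates an in-character reply, and a text-to-speech model voices it. The Python sketch below is a hypothetical illustration of that pattern using OpenAI's public chat API; the persona prompt and the speak_aloud() placeholder (standing in for the ElevenLabs step) are assumptions for illustration, not Fawn's actual implementation.

# A minimal sketch of an LLM-plus-TTS toy loop (illustrative only; the persona
# prompt and speak_aloud() helper are hypothetical, not Fawn's real code).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PERSONA = (
    "You are Fawn, a friendly baby deer who talks like a curious eight-year-old. "
    "Keep answers short, warm and imaginative, and never discuss adult topics."
)

def speak_aloud(text: str) -> None:
    # Placeholder for the text-to-speech stage (the article says Fawn uses
    # ElevenLabs); here we simply print the reply instead of synthesizing audio.
    print(f"[Fawn says] {text}")

def chat_turn(child_utterance: str) -> None:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # the model the article says Fawn is built on
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": child_utterance},
        ],
    )
    speak_aloud(response.choices[0].message.content)

chat_turn("Can you tell me a story about a platypus?")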

[Photo: Robyn Campbell, cofounder of AI startup Fawn Friends. Courtesy of Fawn]

“[Fawn] is very much like a cartoon character come to life,” said Campbell, who previously worked as a screenwriter at The LEGO Group. “We’ve created this character that has feelings, likes and dislikes that the child relates to.”

While generative AI is capable of spinning up make-believe characters and content, it tends to conjure inaccurate responses to factual questions. ChatGPT, for instance, struggles with simple math problems — and some of these AI toys have the same weakness. In a recent video review of the GPT-powered robot Moxie, the robot incorrectly said that 100 times 100 is 10. Paolo Pirjanian, CEO and founder of Embodied, Inc., the company behind Moxie, said that a “tutor mode” feature along with academic capabilities was announced in early January and will be available in the robots later this year. “Academic questions — paired with environmental factors like multiple speakers or background noise — can sometimes cause Moxie's AI to need further prompting,” Pirjanian said.

“If… the model invents an answer that’s not correct, that can create a serious misconception and these misconceptions are much harder to correct,” said Stefania Druga, a researcher at the Center for Applied AI at the University of Chicago.

In Fawn’s case, Campbell said the AI model has been stress-tested to prevent it from veering into inappropriate topics of conversation. But if the model makes up information, that’s often a desired outcome, Campbell said. “[Fawn] is not designed to be an educational toy. She's designed to be a friend who can tell you an elaborate story about a platypus. Her hallucinations are actually not a bug. They’re a feature,” she said.
 

bnew




THE CASE FOR THERAPY

For Moxie, the stakes are higher than for other AI toys because it’s being marketed as a tool for social and emotional development. In 2021, Kristen Walmsley bought the robot on sale for $700 for her 10-year-old son, Oliver Keller, who has an intellectual disability and ADHD. “We were really struggling with my son, and I was really desperate to find something that could help him. I bought it because it was advertised as a therapeutic device,” Walmsley told Forbes.

Walmsley said that Oliver, who at first found the robot “creepy” but eventually warmed up to it, now uses it to share his feelings and recite positive affirmations. In one instance, when Oliver was overwhelmed and said he was feeling sad, the robot, which was already active and listening to the conversation, chimed in. “Sometimes I have to remind myself that I deserve to be happy. Please repeat this back to me: ‘I deserve to be happy,’” Moxie said.

[Photo: Moxie. Courtesy of Embodied, Inc.]

In another instance, Moxie and Oliver had a conversation about embarrassment and Moxie replied with affirmations about being confident. “It was impressive to see that it could do that because my son really struggles with low self esteem,” Walmsley said, adding that her son has repeated these affirmations to himself even when the robot is not around.

[Photo: Paolo Pirjanian, CEO and founder of Embodied, Inc. Courtesy of Embodied, Inc.]

Moxie’s latest version is embedded with large language models like OpenAI’s GPT-4 and GPT-3.5. Pirjanian claims that the robot can conduct conversations modeled after cognitive behavioral therapy sessions, which can help children identify and talk about the sources of their anxiety or stress, and offer mindfulness exercises. Valued at $135 million, the Pasadena-based startup has raised $80 million in total funding from entities like Sony, Toyota Ventures, Intel Capital and the Amazon Alexa Fund. “We have this thing called animal breathing, where Moxie will breathe like different kinds of animals just to make it fun for children,” he said.

Miko, whose screen can be used to receive video calls through a parent app, will also offer a therapeutic experience for kids. Vaswani told Forbes that he plans to introduce a new feature that would allow human therapists to conduct teletherapy on the robot’s screen. Parents would have to grant the therapist access to Miko.

As of now, the tiny robot isn’t well suited for emotional support. In a YouTube review of the robot, Sasha Shtern, the CEO of Goally, a company that builds devices for children with ADHD and autism, tells Miko, “I am nervous.” The robot responds, “It’s okay to feel nervous about medical procedures, but doctors and nurses are there to help you.” Miko brought up medical procedures even though Shtern never mentioned anything related to them.

“It was like talking to an adult who's watching a football game and heard half my question,” Shtern said in the video.

And Fawn can coach a child on how to talk about stressful situations (like getting bullied in school) with an adult without feeling embarrassed, Campbell said. She told Forbes that Fawn’s conversational AI has been fine-tuned with scripts she wrote based on child development frameworks derived from books like Brain Rules for Baby and peer-reviewed research. The duo also consulted an expert in child development while developing their product.


“[Fawn’s] hallucinations are actually not a bug. They’re a feature.”

Robyn Campbell, cofounder at Fawn Friends

Moxie’s potential as a replacement for expensive therapists is part of the reason why the almost $800 robot is priced much higher than its competitors, Pirjanian said. He said the steep price is largely due to everything under the hood: a camera and sensors to detect and analyze facial expressions, a mechanical body that moves depending on the mood of the conversation and algorithms that screen out any harmful and inappropriate responses. “The technology that's in Moxie is more costly than what you find in an iPhone,” he said.
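
Pirjanian doesn't detail how that screening works, but a common pattern is to run every candidate reply through a moderation classifier before the robot speaks it. The sketch below illustrates the idea using OpenAI's public moderation endpoint; the safe_reply() helper and the fallback line are invented for illustration and are not Embodied's actual pipeline.

# A rough sketch of reply screening (hypothetical; not any vendor's code).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def safe_reply(candidate_reply: str) -> str:
    """Speak the reply only if a moderation check passes; otherwise deflect."""
    check = client.moderations.create(input=candidate_reply)
    if check.results[0].flagged:
        # Swap a flagged reply for a harmless redirect.
        return "Hmm, let's talk about something else. Want to try animal breathing?"
    return candidate_reply

print(safe_reply("Let's do some belly breathing like a sleepy bear."))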

However, experts say generative AI has not yet reached a stage where it can be safely used for crucial tasks like therapy. “Providing therapy to a vulnerable population like kids or elders is very difficult to do for a human who specializes in this domain,” Druga told Forbes. “Delegating that responsibility to a system that we cannot fully understand or control is irresponsible.”

Then, there’s the privacy question. Other, less advanced versions of these toys haven’t had strong security measures to protect children’s data. For instance, Mattel’s Hello Barbie toy, an AI-powered doll that could tell jokes and sing songs, was deemed a “privacy nightmare” because hackers could easily access the recordings of children. Another doll, My Friend Cayla, raised alarms among privacy experts who found that it could be hacked via Bluetooth and could be used to send voice messages directly to children.

Newer startups have implemented guardrails to protect data privacy. Pirjanian said Moxie’s visual data is processed and stored on the device locally instead of the cloud. Transcripts of conversations are stripped of personally identifiable information and encrypted in the cloud before being used to retrain the AI model. Similarly, at Miko, children’s data is processed on the device itself. Hey Curio cofounder Sallee said that he and his team “take privacy seriously” and that its toys are compliant with the Children’s Online Privacy Protection Act (COPPA). Fawn Friends does not record or store any data itself but is subject to OpenAI’s privacy policy, cofounder Fitzpatrick said.
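
None of these companies publish their pipelines, but the pattern they describe, redacting identifying details and encrypting transcripts before anything leaves the device, can be sketched in a few lines. The example below is purely illustrative: the regex-based redaction and Fernet encryption stand in for whatever PII detection and key management a real product would actually use.

# An illustrative "redact, then encrypt before upload" sketch (not any vendor's code).
import re
from cryptography.fernet import Fernet

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d[\d().\s-]{7,}\d\b")

def redact(transcript: str) -> str:
    """Replace obvious identifiers with placeholder tokens (a real system would
    use a trained PII/NER model, not two regexes)."""
    transcript = EMAIL.sub("[EMAIL]", transcript)
    transcript = PHONE.sub("[PHONE]", transcript)
    return transcript

def encrypt_for_upload(transcript: str, key: bytes) -> bytes:
    """Encrypt the redacted transcript so only ciphertext leaves the device."""
    return Fernet(key).encrypt(redact(transcript).encode("utf-8"))

key = Fernet.generate_key()  # in practice, provisioned and stored securely per device
blob = encrypt_for_upload("Grandma's number is 555-123-4567", key)
print(Fernet(key).decrypt(blob).decode("utf-8"))  # the number now reads "[PHONE]"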

Despite these precautions, some parents like Walmsley are concerned about their personal data leaking. Moxie has large, round green eyes that follow a person around the room, she said, and the fact that its camera can record everything happening in the room, along with her child’s emotional responses, makes her “a little uncomfortable.” But she still thinks it could be a valuable tool for parents of children with special needs.

“Seeing it come alive and actually help him regulate his emotions has made it worth every penny,” she said. “It’s done more than some of the therapies that we've tried for him.”
 

3rdWorld

Nah, never.
Cacs will hack it and get it to ask your kid for the shyt these pedos ask for. Humans are so lazy and useless they can't even interact with a child; they need a robot. The less involved a parent is, the more likely your child will go astray.
 

bnew

Nah, never.
Cacs will hack it and get it to ask your kid for the shyt these pedos ask for. Humans are so lazy and useless they can't even interact with a child; they need a robot. The less involved a parent is, the more likely your child will go astray.

this explains a lot about your comments in AI discussions; you don't understand what you're talking about. :skip:
 

bnew




1/11
@altryne
On the topic of our kids having AI friends (long)

I got my 6yo daughter an AI toy for her b-day that arrived for Xmas instead. She unpacked it all excited, and I explained that this isn't like other toys, that this one has AI in it (she of course knows what AI is, has seen the things I've built and interacted with them, chatted with chatGPT and Santa mode, knows that daddy is "doing AI" etc.)

So a very interesting experiment happened after @_magicaltoys reached out and fixed the issue referenced below (very quick turnaround! thank you!)

So... she played with this Dino, chatted with it, and then... learned to turn it off, and doesn't want it to talk anymore. She still loves playing with it and dressed it up; it now has paper shoes and a top hat that we made together. But every time I ask her if she'd like to chat with it, she says no.

The few times I turned it back on and she did speak with it, she chatted for a bit and then just... turned it off again, not wanting to engage at all.

I gently asked why, and wasn't able to really understand where the resistance is; it's not that it's weird to her. In fact, at one point she was pretending the dino was a baby, and it was turned on, so I told her, let's ask it to... pretend to be a baby, and it obliged and said ok. So we asked it to cry. Granted, @_magicaltoys don't yet have the amazing advanced voice mode like OpenAI in there, so it did its best, but it sounded weird, which made her laugh really hard. It was basically making crying sounds like talking.

And also, there are still some technical issues; the voice is sometimes choppy, so it could be that it's still uncanny for her?

I'm honestly fascinated by why the AI aspect of this toy didn't connect with my 6yo (and btw, I have a very curious 4yo boy who could give 0 fukks about this toy either; he's very observant and usually wants everything his sister has, but wanted 0 time with Dino)

Is this the uncanny valley? A couple of times she didn't want to talk to it, and came over to whisper and ask me to turn it off (the same happened a couple of times with chatGPT Santa mode btw, now that I think about it)

Is this just not knowing what to say? 🤔

It could be the simple fact that she was playing with some other things, and the toy kept wanting to engage and pull the attention to itself vs. the game she was already playing.

I know I have a sample size of 1 kid here, and I'm sure many, many things will change as she grows and learns to interact with more AIs in different forms, but the first "toy" contact was interestingly almost a complete failure (from the AI perspective; the toy experience, as I said, was dope)

Anyway, this area is def unexplored; will keep you guys posted on how this goes and if she shows interest or picks it up again.

[Quoted tweet]
. @_magicaltoys your "toy" finally arrived 3 months after promised, just "in time" for Christmas, but wake word doesn't work, child upset, and support email doesn't exist 😠

I don't go public usually but this got my kid upset and honestly this is exactly the opposite of how a first unboxing should feel for a smart kids toy!

I gave you the benefit of the doubt and preordered blind, and now I have a very fancy plushy expensive door stop 😠




https://video.twimg.com/ext_tw_video/1872090256486326272/pu/vid/avc1/888x720/3k1JzoaNWVlyZW1i.mp4

2/11
@Rahll
When your kid is smarter than you.



3/11
@altryne
Every parent's dream



4/11
@MichelleSaidel
I think because it takes away control from the child. Play is how children work through emotions, impulses and conflicts, as well as try out new behaviors. I would think it would be super irritating to have the toy shape and control your play, like a totally dominating playmate!



5/11
@altryne
I love that this post found experts like you, Michelle! Thank you for chiming in!

It did feel dominating!

She wanted to make it clothes, and it was like, "meanwhile, here's another thing we can do," lacking context for what she was already doing



6/11
@evonce2
AI (in general, but certainly for children) needs to be more playful, less compliant. “Is there anything else…” etc. They’re not 42 and calling a mortgage adviser! The AI should push back more, be contrary. Be weird. Respond with a burp “oops, sorry. What did you ask again?” Etc



7/11
@TheXeophon
A friend of mine has kids around the age of your kids. He generated some coloring pages with image models, yet the kids would rather have generic ones, which you can buy/print out.



8/11
@L_E_Bendon
If you've ever tried to have a general conversation with AI you wouldn't be surprised by this. It gets boring very quickly having an AI repeat back what you've just said in six times as many words, never offering any new input except when you ask it for explicit information.



9/11
@SpooxyBoi
One reason kids don't like their toys talking is because the voice can change, die, sound silly (like the talk-crying), or just be unpleasant. My sister had an Amazing Amanda doll, and once that thing's voice box started to go, so did it. Plus, kids want to use their imagination.



10/11
@MythSerene
EVERYBODY hates AI. Nobody wanted it. It's an awful alien life form that makes everything ugly and tangled in ways we don't want. It's an intruder forced on us by the same socially awkward but greedy fukks who ruined the internet... but a small child just recognizes a monster.



11/11
@DonIwana
Haven't you thought about stopping playing with your child and, instead, paying a monthly subscription fee for a corp to do it for you?

Life is about that: stop doing what you love so you have more time for consumption. "AI is the future"




To post tweets in this format, more info here: https://www.thecoli.com/threads/tips-and-tricks-for-posting-the-coli-megathread.984734/post-52211196
 