Individual attention is good. And AI as a tutor can be helpful because it does have infinite patience.
That being said, genuine curiosity plays an important role, and if a kid isn't curious about a subject, AI isn't going to help with motivation.
What’s much more likely is that (eventually) AI Teachers and AI Doctors are going to be the best we’ve ever had. No human, not even the parents of only children, can lavish on a child the time, expertise, and attention these AIs will.
No, that’s pretty unlikely. They have time and attention, but not really expertise. They have a good command of straightforward knowledge, but just imagine what a shitshow it would be explaining the politics of the American Civil War. Or Vietnam.
Yeah, AI knows what a gerund is and how to calculate the area of an ellipse, but it will struggle with more philosophical topics that don’t have a clear-cut right and wrong answer.
AI is exceptionally good at spouting facts that look real but are actually bullshit.
Hallucination is a thing. It’s a problem because you can’t tell when the AI is hallucinating and when it isn’t. But there are a vast number of grade- and high-school-level topics it won’t hallucinate about. Like, yeah, you can’t ask it how many footballs long a hockey rink is, but you can ask it how to go about solving that question for yourself, and it will answer, which is what you want the AI to be doing anyway instead of solving the problem for you.
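For what it’s worth, the footballs-per-rink question is just a unit conversion, and the “show me how to work it out” answer you’d want from the AI boils down to something like this (a rough sketch, assuming an NHL rink at 200 feet and an American football at roughly 11 inches; those numbers are my assumptions, not gospel):

    # How many footballs long is a hockey rink?
    # Assumed figures: NHL rink length ~200 ft, American football ~11 in.
    RINK_LENGTH_FT = 200
    FOOTBALL_LENGTH_IN = 11

    rink_length_in = RINK_LENGTH_FT * 12              # feet -> inches
    footballs = rink_length_in / FOOTBALL_LENGTH_IN   # ~218

    print(f"about {footballs:.0f} footballs, give or take")

The point is the method (convert both lengths to the same unit, then divide), not the exact number.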