This can be the way things are taught, who the teachers are, what a school day would look like, where classes are taught, what things would look like, etc.

[–] [email protected] 1 points 1 year ago (1 children)

I do think it is valuable to separate knowledge-seeking (where it is good to have access to knowledge not limited by the parents) and emotional support. Being able to learn without fear of being judged and evaluated is valuable. Also, as a parent, I know that my reserve of patience is not infinite, and I am happy that my kid finds sources of knowledge that are independent of me and my biases. Then we discuss things.

Overall I think the best way is to allow kids to find their own best ways to learn.

As someone who finished high school at the beginning of the internet, I can guarantee that access to free information unencumbered by the limitations of the adults around me (including my loving, knowledgeable parents) was essential. To me, a virtual AI tutor is just a means to make this accessible earlier. My kid still has trouble reading long and complicated texts, and I'd rather have a smart tutor offering him audio content than random youtubers.

[–] [email protected] 1 points 1 year ago (2 children)

Knowledge will obviously come from other sources too. When kids socialize with others they will learn things naturally, and discussion should absolutely be encouraged. However, AI introduces a lot of problems. AIs have biases based on the information they learn from, they require resources to build and maintain, and they cannot discuss information accurately. I just don't see what AI adds over simply interacting with other people.

Solarpunk societies, like all post-capitalist societies, are built on strong human relations; replacing one of the avenues for creating them with a hallucinating rock (an exaggeration, I know) just seems weird.

[–] [email protected] 1 points 1 year ago (1 children)

Things my kid learnt through "socializing and learning things naturally":

  • everything you can't explain is caused by either aliens or ghosts
  • you will burn in hell if you don't go to church on Sunday
  • you can read the future in the palms of people's hands
  • kids who agree to take the school bus routinely die

AIs have biases based on the information they learn from, they require resources to build and maintain, and they cannot discuss information accurately.

Notwithstanding the fact that we are talking about future tech, and that the pace at which AI is advancing right now is crazy, I think that even today, on these metrics, they do better than we do as a society without them.

  • Bias: we all have it, but unlike ours, the biases of AIs are being fixed at an incredible pace
  • Accuracy: Truthfulness is the metric everyone is trying to improve right now, with huge successes in just a few months. I am willing to bet that you will find more mistakes in the books at the local library than in the answers of a decent LLM. On all subjects that are part of the curriculum up to high school, I am willing to bet that it is more reliable than the average human teacher.
  • Discussion: It is extremely good at conversation, something that non-living repositories (books, videos) are incapable of.
  • Resources to build and maintain: You need to build the model once. I am not sure what maintenance you are talking about. Decent models now run on personal computers or even phones (see the quick sketch below). Even the training cost is negligible compared to producing physical textbooks (and renewing them every year!)
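For a sense of scale, here is a rough sketch of running a small open model locally with the Hugging Face transformers library; the model name is just an example of a small open chat model, swap in whichever one you prefer:

```python
# Rough sketch: running a small open LLM on an ordinary laptop.
# The model name below is only an example; any similar small chat model works.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # ~1B parameters, laptop-friendly
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Ask a simple "tutor" question and print the generated answer.
prompt = "Explain photosynthesis to a ten-year-old, in three short sentences."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
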
[–] [email protected] 1 points 1 year ago (1 children)

The examples you provide are negatively biased. You don't know all of the normal and useful things they learn because those don't stand out. Also, two of your examples (church and school buses) come from current cultural biases, something a solarpunk society would hopefully mitigate.

I think AI is not suited for discussion. It might be good at conversation but discussion isn't just conversation. Discussion requires understanding of others to a degree I don't think AI can achieve.

I concede my point about resources, but will add that the model will get outdated and will need retraining every once in a while.

Textbooks are bad, I agree. I just think they should be replaced with a human who knows what they are talking about, and the topics learnt should be things the kid actually wants to know instead of what people think they should know.

Also I can't help but notice you ignored one of my core arguments: that solarpunk societies are about strong human connections and replacing one of the main sources of these connections is a bad idea.

I also think that the process of finding information is as important as the information itself. If all of your questions are answered just by typing them into a computer, then you never learn the importance of checking information accuracy, accounting for bias, and other very useful skills.

AI lets you shortcut straight to the information you seek, which means you never learn how to actually think for yourself.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Also I can’t help but notice you ignored one of my core arguments: that solarpunk societies are about strong human connections and replacing one of the main sources of these connections is a bad idea.

I'll address that one first, then.

I have had people tell me that we should not automate cashiers because it removes human connections. I believe that humans want human connections, and that if you remove the obligation to get something out of these connections, they will become richer and more meaningful. I sometimes feel I am tricking my kid into taking an interest in "useful" things. I'd rather play with him than force him to rewrite the same word a hundred times in pretty, round letters.

When you remove painful obligations from human connections, you may sever some that you were forcing yourself into, but you also make room for many more of them. If teaching were taken care of, I would spend more time showing my kid what I really love in life: places that make me feel at peace, events that make me feel alive, techniques that I find interesting despite their practical uselessness.

Humans are social animals. We make social connections and may even die for them. Don't worry: removing an obligation will not remove the need for more meaningful connections.

Now for the rest:

Why do you assume that a society which manages to mitigate biases that are extremely central to our current problems in social relations would have a hard time mitigating bias in an AI's training dataset?

Discussion requires understanding of others to a degree I don’t think AI can achieve.

Look at what AI mentors can do today, right now. Ask them follow-up questions, ask them about what you don't understand. Several conversations with GPT-4 (which is the best right now, but trailed by more open models) convinced me otherwise. Even if you argue that such models are currently not as good as a human tutor, I find it hard to argue that they do not beat the "conversation" of a class of 30 with a lone teacher.

I also think that the process of finding information is as important as the information itself. If all of your questions are answered just by typing them into a computer, then you never learn the importance of checking information accuracy, accounting for bias, and other very useful skills.

True in every day and age and on every medium. I had to teach it to several kids in my family, as school seems to do a pretty poor job of it. Interestingly enough, it is also something that LLMs can teach well, despite being (currently) pretty poor at it themselves.

[–] [email protected] 1 points 1 year ago (1 children)

Okay, I agree. LLMs might be useful for education. But they should not replace practical experience.

I am just skeptical of everyone who says we should replace something with AI, because most of the time these decisions seem to be motivated by profit and aren't actually better.

[–] [email protected] 1 points 1 year ago

I think 99% of the companies that hope to replace something with AI for profit are going to fail, as these models become lighter and lighter, gain open-source equivalents, and have a VERY motivated community behind them working to prevent corporate lock-in.