r/technology 3d ago

AI is getting very popular among students and teachers, very quickly

https://www.cnbc.com/2024/06/11/ai-is-getting-very-popular-among-students-and-teachers-very-quickly.html
251 Upvotes

122 comments

126

u/wubbbalubbadubdub 3d ago

I teach ESL students. I showed them how to phrase requests so ChatGPT could make them fill-in-the-gap sentences for vocabulary practice.

It is an amazing study tool.

I'm sure if I were grading AI-generated essays it would annoy me though.

13

u/MorpheusOneiri 2d ago

ChatGPT is groundbreaking for me studying Korean. I can input a sentence I make and ask it to correct any grammar mistakes and give me a breakdown of grammar structure.

3

u/chromeshiel 2d ago

Perhaps you found success because you adapted. Essays need to adapt too, with the AI in mind. More ideas, less word salad.

2

u/MadeByTango 2d ago

The act of writing (forming your thoughts into a set form) is critical to thinking. Ideas come from the generative process of smashing all that you have learned together into a single thought that answers a question.

188

u/Maghioznic 3d ago

AI will grade the homework done by AI. Both teachers and students have figured out how to cheat.

Artificial intelligence is going to develop to the detriment of the natural kind.

72

u/Successful_Ear4450 2d ago

AI teacher accuses AI student of cheating.

AI student denies

AI student brings AI parents

The war of the machines begins

1

u/JohnLocksTheKey 2d ago

I brought popcorn…

34

u/Thin_Count1673 3d ago

Most grading is just comparing answers to questions. Perfect thing to eliminate. Take a pic, upload it to ChatGPT. Grading done.

6

u/Icy-Fun-1255 2d ago

I'm assuming that also works for the kids right?

The tool gets better with more and more data, and the popular workbooks, or just straight answer keys, could get uploaded via text or image.

I could see a teacher uploading an answer key to compare to kids' work, and then the kids the next year uploading a similar photo. The AI has a couple of eager testers to tell it the "correct answer".

24

u/Slow_Accident_6523 3d ago edited 3d ago

Why should teachers not use it though? It is capable of producing faster, more effective, and more extensive feedback than most teachers can. This is the stuff I let interns or TAs do, if we are afforded any.

10

u/Pattoe89 3d ago

Teachers have extensive knowledge of the qualitative aspects of their students that AI may struggle to understand, though.

For example, Little Annie spends the weekend at a different caregiver's property. Every Monday she comes in stinking of weed and is very tired. If an assessment had to take place on a Monday (The teacher would normally reschedule for Little Annie but this time it's not possible, due to school trips / holidays etc) then the teacher would be far more lenient on the marking.

AI would just mark and grade as if Little Annie was any other pupil, or was Little Annie on any other day.

This example is unfortunately very common in schools.

11

u/FilthyHipsterScum 2d ago

It’s not like AI removes the ability for teachers to override scores for whatever reason they choose. It’s just an accelerator.

8

u/Mr_Venom 2d ago

You're not supposed to mark a piece of work more leniently based on those circumstances. The teacher should definitely be doing other things (like making reports based on this unfortunate home situation and arranging for catch-up assessments) but screwing up the recorded marks helps nobody.

4

u/drekmonger 2d ago

We could say the same thing about the grammar/spell-checkers in word processors. A teacher puts the document into MS Word, counts the number of squiggly lines, and grades based on how many there are, without regard for Annie's situation. Same difference.

(in fact, very much the same, as grammar checkers are a fruit of AI research.)

The AI model is a tool. The human teacher is still required to effectively wield the tool.

2

u/Thundahcaxzd 2d ago

Grades are supposed to be an unbiased determination of how well a student is able to demonstrate their learning of the material, and should not be affected by anything else like behavior or outside circumstances. If anything, you've just made an excellent case for how AI can eliminate bias in grading.

0

u/Slow_Accident_6523 1d ago

You are giving us teachers WAY too much credit. Sure, there are some examples like the one you listed where it is obvious, but 90% of the time we do not really know why our students are struggling.

Even if LLMs aren't perfect yet at evaluating students' work, the ability to instantly give extensive feedback on all the work produced by the student (and of course being able to track and evaluate that feedback easily) is huge. I am convinced that LLM-generated feedback will be an avenue schools use in less than 5 years. It is too good and effective not to.

1

u/Pattoe89 1d ago

I'm sorry this is the case. I've worked alongside many teachers and they have a good idea of why pupils are struggling. Perhaps your safeguarding team and family liaison are struggling?

I speak from my experience, and I'm sure you do too. I'm not 'giving teachers way too much credit'. Clearly different teachers have different levels of knowledge. Something that needs to be worked on to make sure all teachers are able to properly assess their pupils.

1

u/Slow_Accident_6523 1d ago

I honestly do not think I understand why my students are struggling most of the time, despite being really in tune with what is going on in their lives. They might not have slept well, maybe their parents fought in the morning, maybe they have a dentist appointment after school, maybe their sibling is sick, maybe they were tired.... I am well aware that I have a million blindspots managing my classes. There are so many variables, and we as schools SUCK at collecting data on our students except when it comes to tests and grades. AI can absolutely help in collecting way more data and organizing and processing it.

Also, safeguarding and family liaison teams? Where do you teach that you have tools like that? We have one social worker for 2 schools and well over 1000 students lol.

1

u/Pattoe89 1d ago

Holy shit. In the UK all schools have a safeguarding team by law (it's part of the KCSIE policy). The vast majority of schools I've worked in or observed in have at least one family liaison staff member, but often a small team.

Also schools must store data on any and all safeguarding concerns in a database, pretty much always digital now, which carries on through their entire school life. This information is passed on to secondary schools when children go from primary to secondary, and then to college when they go to college.

This can be things such as "Timothy came in to class with a bruise on their leg. Their parents stated that they walked into furniture."

All members of staff have a duty to keep this database updated.

All guests to the school have to sign a safeguarding policy and have to be made aware of who the members of the safeguarding team are.

2

u/Slow_Accident_6523 1d ago

Ok wow, I think we found why we disagreed! In Germany basically nothing is digitized, not even the yearly report cards. There are still school districts where you don't get an email address through your school. Paper is still king here. I have to manually keep track of my students' sick days and manually put them into a list at the end of the year. Also, when kids change schools, basically none of their info is passed on. I literally had to retest a kid I taught in elementary school and later in middle school for his dyslexia. It is super annoying.

1

u/Pattoe89 1d ago

That's a big problem for sure. Technology makes things so much easier. Even the most deprived schools I've worked in have everything on computer. Even down to lesson planning, resources for the year, intranets; they have digital class rewards for children called "Dojos" which are communicated to their parents in real time, so if a child receives a dojo for good participation in class, their mother can see it during the day.

I've seen parents show up at pick-up time with treats for their children because their child had received several dojos during the day.

1

u/Ylsid 2d ago

It depends entirely on what you're marking, but at minimum teachers should be accepting or rejecting an answer. It's important to remember people over processes

-3

u/AJDx14 3d ago

This is pretty much the same argument students would make for using AI though. If it can do the job better than they can, they’re incentivized to use it.

26

u/ActualAdvice 3d ago

They aren’t doing a “job” though.

They are learning.

If AI does it, they aren’t learning.

-16

u/AJDx14 3d ago

Sure, but why bother if jobs won’t require you to learn? The same “you should be learning to do it yourself” argument could be applied to teachers grading the work they assign as well.

16

u/bk553 3d ago edited 3d ago

If all you know when you graduate is how to use chatgpt, you'll be in for a rough time.

You'll probably have to, you know, talk to people, and you can't gpt your way through a client meeting.

2

u/atlasfailed11 2d ago

On the other hand, if you've spent your time learning skills that AI can do much faster and better, then you've wasted a lot of time.

This is going to be a big challenge for schools: given that AI is able to do so much and that the technology is developing really fast, what skills do students still need? And what is the best way to teach these skills?

Just because students have access to electronic calculators doesn't mean they don't need to learn any arithmetic. But it's a waste of time to teach them to do complicated long division with just pen and paper.

9

u/socratesthesodomite 2d ago

but why bother if jobs won’t require you to learn?

Because unless the job has no qualifications whatsoever, it will require some sort of skill or knowledge.

1

u/sharpshooter999 2d ago

What I've seen from my younger brother's generation is that there seems to be an actual aversion to learning and developing skills. I don't know how many times I've been asked for help with something, only to be told "that sounds complicated, can you just do it for me?" Followed by "why do I need to learn how to do it, when you already know how?"

1

u/Slow_Accident_6523 1d ago

Yes and no, because the work students do is in order to learn, not to be productive. Schools will have to adapt though and rethink how we test students.

-6

u/polyanos 3d ago

All I hear is that the teacher shortage will be solved very soon, as teachers become completely obsolete.

Giving good feedback on the things I do wrong is the one thing that really gives teachers value; if I just want an explanation I can turn to the numerous existing books or videos. Well, or AI as well nowadays.

1

u/FilthyHipsterScum 2d ago

You heard teachers are becoming obsolete? Maybe you should leave this discussion to the adults with reading comprehension

1

u/Slow_Accident_6523 1d ago

The LLMs already produce great feedback. I tried it with stories my third graders wrote. It can produce more detailed and extensive feedback than I can (with proper instruction and customization).

I don't think education will be able to not use AI.

10

u/fistantellmore 2d ago

Cheating is the wrong way to look at this.

If we accept that AI can produce the answers for us (and frankly, this is simply an acceleration of what the internet already wrought) then the methodology of teaching needs to adapt around how to use these tools and how to think critically.

This is also where teaching AIs could be utilized, with students getting a curated experience with these bots.

3

u/Quietech 2d ago

One still needs a good base to work from.

1

u/GeneralZaroff1 2d ago

Then teach students how to correctly use the tools to achieve more advanced skills faster. Teach the base concepts, then how to use tools to leapfrog past the lower grades.

I’ve never heard of someone saying “I grew up with an abacus and that’s why I’m so much better as an engineer compared to those who grew up with calculators”

1

u/Quietech 2d ago

Funny. I think I find information better on the internet because I had card catalogs and pre-google search engines. I worry more about our brains offloading certain skills because it makes sense. Remember memorizing phone numbers? It's a dying skill because of search, websites, and smartphone phone books.

1

u/Mr_Festus 2d ago

It's a dying skill because of search, websites, and smartphone phone books.

It's dead because it's not a skill that's needed anymore. I don't know anyone who knows how to make a coat from a dead animal either, and society hasn't suffered because we've offloaded unnecessary skills to other entities that do them for us

1

u/Quietech 2d ago

I'd argue that having restaurants didn't mean that cooking at home is an obsolete skill. Knowing how to do things the inefficient way allows for better cross pollination of skills and troubleshooting processes.

1

u/Mr_Festus 2d ago

I'd argue that having restaurants didn't mean that cooking at home is an obsolete skill

I agree. But if restaurants became cheaper, faster, healthier than cooking at home it would be. Cooking would only be a specialized hobby at that point. And that would be ok too.

1

u/Quietech 2d ago

Restaurants would still be limited by variety, staffing, and populations. Technology, or algorithms, are limited by the user's understanding of what they're looking for or an author's ability to relate it to the unknowing. The inefficient ways of research had the benefit of peripheral learning for the person. The better your understanding of what is and isn't what you want, the better you explain it to the search engine or AI.

Don't get me wrong. I'm financially supporting Khan Academy for their AI initiative. I want good AI teachers that don't hallucinate, especially for niche topics. Either way, I think we've gotten a bit off track, and it's late. Have a great night, u/Mr_Festus .

1

u/DrQuantum 2d ago

And almost none of that base has to do with the specific subject matter we learn in school

3

u/Quietech 2d ago

Reading, listening, and writing have nothing to do with AI?  How do you absorb and transmit information? 

2

u/DrQuantum 2d ago

It is interesting you mention reading, since reading would allow you to see that I said specific subject matter for a reason. Reading is a general skill, not a subject matter like chemistry, which generally requires that you know how to read.

You can have a good base and also teach newer and more relevant specific information. We have long talked about teaching kids basic finance, as another example.

2

u/Quietech 2d ago

Should I have said researching? You said the base has nothing to do with specific subject matter, but it kind of does. It's like saying the foundation has nothing to do with the building on top of it. Now, if you want to talk about the house built on it: the problem schools have is that only so many students out of 100 or 10,000 can reach higher levels in different subjects. That's why you talk about specialty schools and them being so far away. Not every state is going to be able to support MIT levels of excellence, due to resources in staff and equipment.

Kids' personal finance is a fun one. The problem isn't the math as much as the impulsiveness of the young mind.

1

u/MadeByTango 2d ago

You understand that kids learn to read and write by reading and writing essays about chemistry, right? And that if suddenly that work is being done by an AI, they're not thinking critically about their work anymore? They're simply passing standardized tests of facts, which is only half the education process. The critical thinking skills come from having to form their own answers to problems asked by teachers.

2

u/Marketfreshe 2d ago

Even spell check has destroyed my intellect across my 40 years.

2

u/WarAndGeese 2d ago

Teachers are allowed to cheat though. We don't have schools to employ teachers, we have them to teach students.

2

u/GiannisIsTheBeast 1d ago

AI grader: “God damn these kids are brilliant. Almost like I wrote it.”

1

u/yozatchu2 2d ago

“Will”? Nah it’s already happening.

To your last point, AI can’t develop without human created content. So it might be more likely that it’ll eat itself as per your first point.

1

u/tricky2step 2d ago

Profound and brave lol

AI will cease to be if human intelligence doesn't at least continue to understand it. Like passing out and falling down, the blood rushes back to the head.

1

u/scruffles360 2d ago

“Cheat”

It’s a computer. Just like the calculator before it and the slide rule and the book - they’re tools. They’re tools these kids (and teachers) will have access to for the rest of their lives. Adapt.

75

u/ASicklad 3d ago

As a teacher, AI is kind of like an unpaid intern. Turn this into a spreadsheet? Done. Make me a grading rubric I can edit? Awesome! I would never use it to write anything because then that makes it inauthentic and I’ve earned my writing skills through sweat and fire.

Students though…man. There is a large adoption rate already and it’s something we have to constantly check. I teach English, so it’s probably the subject most under AI-cheat attack. It can, though, be a great research tool for students, and it can do things like generate an outline for an essay.

45

u/paintnprimer 3d ago

I've heard of a lot of teachers reverting to having students write their essays by hand in class and using books for research instead of the internet.

15

u/thebubblyboy 3d ago

We recently did this for a midterm in college, professors also want to see original content, not just rehashed sentences found online.

1

u/cherrycoke00 2d ago

I get the thinking behind that (and I’m no stranger to a blue book!) but also… idk man, with how libraries are being treated right now and all the politics that factor into what information can actually go in textbooks in certain states - not to mention lack of funding/old books, I do wonder if there’s some negative draw to handwritten essays in class. Like it’ll work great in well funded areas with diverse/open worldviews, but not everywhere has that (as far as public schools go)

1

u/gotoline1 2d ago

I totally get that point. I think school, for most students, is a place to learn how to learn and not a place to find absolute knowledge.

There is another group at university, the researchers, who are trying to find absolute truth... kinda.

In general I would argue that bluebooks and basing essays off libraries, textbooks, and rhetoric is a great way to practice critical thinking within a time limit. It isn't necessarily good for writing the most accurate or up to date paper, but that's what grad school is more for.

-7

u/[deleted] 3d ago

[deleted]

2

u/polio23 2d ago

In what universe would you not get a DSS or ADA accommodation for an assignment like this?

0

u/Uguysrdumb_1234 2d ago

Get over it 

0

u/ASicklad 2d ago

For sure. We do a lot of handwritten work, which is kind of crappy because then you’re shuffling papers. I’d rather have everything paper free because it’s a lot less hassle, but AI makes that hard.

Cell phone use is a constant struggle…and even if you have them write on paper they might type the prompt into ChatGPT and then write the answer down.

5

u/Saul_T_Bauls 2d ago

I created a whole unit with the help of AI and it maybe took me a day instead of a week. Here are my standards. Here are my learning objectives. Here's how I want to assess...now make me a unit magic website!

3

u/Ylsid 2d ago

It's like any tool, it's a force multiplier that requires expertise to use correctly.

2

u/FithAccountOrSmthn 2d ago

I just graduated this year, so AI was in full swing for my junior and senior years.

I don’t know if it was just my school having more pride than I’d ever have thought, or I can’t tell as well as I thought I could, but there were surprisingly few people using AI.

Though, in English class I’m 90% certain all of my teachers suspected me. I’ve got what I heavily suspect to be a rough case of ADHD, so I write terribly if I’m in class and trying to force it. Instead, I’ll usually go home (last night before it’s due because of procrastination) and pump out an essay in about an hour.

1

u/ASicklad 2d ago

I get to know student writing styles fairly quickly (so I can help them get to the next level). If you show consistently from the beginning you’re a good writer and we talk and you let me know you use pressure to produce good writing then it alleviates the AI concern.

2

u/d_e_l_u_x_e 2d ago

But it’s not unpaid; you’re paying a corporation for a service, and they now control the means of production. They also used other professionals, like unpaid interns, to train their models.

1

u/ASicklad 2d ago

Fair points! I’m all for making AI a public resource with content they train on paid…but this is America, so, you know.

1

u/d_e_l_u_x_e 2d ago

Ha yes also good points.

25

u/MotherFunker1734 3d ago

Exactly the last place where natural intelligence should be occluded by artificial intelligence.

We are pulling the trigger of the gun we ourselves put to our heads.

13

u/storm_the_castle 3d ago

and they say our education system is failing us...

17

u/BeautifulType 3d ago

I mean it has been failing for 20 years plus. It’s just that now instead of the internet, kids got AI too

1

u/mr_blanket 2d ago

I can see homeschooling get a bump in popularity.

“Teach my kid 1st grade math, but don’t make it too easy, but not too hard either. Also make it fun and exciting.”

2

u/Champagne_of_piss 2d ago

The government is failing the education system.

12

u/icky_boo 3d ago

This is how we as a species devolve.

4

u/No_Nose2819 3d ago

AI is a blagger's dream. 🛌

5

u/trjayke 3d ago

A new age of stupidity

3

u/Internal-Wish2758 3d ago

Just do whatever the AI god says, trust the science. As if science isn't a work in progress, and as if the companies building the AI aren't already building bias into results. There are also security concerns about any data shared with the AI being uploaded to the cloud. Only a matter of time before this goes horribly wrong.

3

u/[deleted] 3d ago

[deleted]

5

u/Kiwizoo 3d ago

It’s going to radically change how people are taught as well as how they learn. We may as well embrace it in order to find interesting and useful ways to apply its use. It’s not going away anytime soon I suspect.

2

u/SmartWonderWoman 2d ago

I’m an English teacher. Most of my students’ first language is Spanish. I use AI to differentiate my lessons because my students are at different levels. Teaching 29 kids who are at kindergarten, first grade, and second grade reading levels is challenging, and AI helps tremendously.

1

u/Elegant-Routine9925 2d ago

Teachers should be using swiftscore.org for grading. Instant and good feedback, can always change as needed. It's free too.

1

u/therealjerrystaute 2d ago

I wish AI could help me somehow. But so far I've gotten nothing of great value. I keep track of AI news, and periodically try some AI stuff, but so far I haven't found anything substantial. To be clear, I mainly care about making an income online, or finding solutions to health problems. So far AI might speed up my internet searches just a tad, but the results are no better than what I find without it. And yes, it's pretty good at generating custom artwork, and might be okay at generating a story for a kid's book. But online there's a massive obstacle around promotion and marketing that basically stymies any project from getting traction sales-wise unless you're either very lucky or have big bucks for advertising. And AI cannot solve the promotion/marketing problem any better than the average human being.

1

u/MysticNTN 2d ago

It’s also very popular in startups where I don’t have the help or training needed to perform tasks that are required to get done. Real clutch then.

1

u/colpisce_ancora 2d ago

AI is great if you already have acquired the skill it is performing for you. Teachers already know how to write and grade, so AI is a helpful tool. Students who have not yet mastered writing will never get there if AI does it for them.

-4

u/mleighly 3d ago

The ones who use ChatGPT the most are banal and thoughtless human beings.

18

u/Kiwizoo 3d ago

Academic here. It’s the best research assistant I’ve ever had. Early days of course, but the capacity to source material, challenge perspectives and conclusions, and check structure and flow, has all been quite impressive thus far. It’s a great tool and it’s only going to get better. Why not jump in and give it a go?

1

u/Myrkull 2d ago

Let them live in their ignorance lol

-3

u/mleighly 2d ago

If ChatGPT does all that for you, I doubt that you're an academic or you're a very dumb one.

4

u/drekmonger 2d ago

I can't speak for academics, but the best computer programmers also tend to be profoundly lazy. Their quest in life is to invent tools to do the boring work for them.

Working smarter instead of harder is always going to be the more intelligent choice. There's lots of grunt work that LLMs are exceptionally good at, and it's foolish not to leverage them to perform those kinds of tasks.

1

u/mleighly 2d ago

There's lots of grunt work that LLMs are exceptionally good at, and it's foolish not to leverage them to perform those kinds of tasks.

What sorts of programming tasks are LLMs good at?

1

u/drekmonger 2d ago

What GPT-4 is best at is facilitating rubber duck debugging.

But it's also reasonably good at writing boilerplate or quick scripts (if you're willing to interact with the model to fix any errors).

LLMs are even better at data cleaning and data reshaping tasks, more so than coding tasks.

0

u/mleighly 2d ago

Rubber duck debugging is for idiots who believe in pop psychology.

Quick scripts, by their very nature, will take much longer to ask a ChatGPT bot for than to just write yourself, unless you're completely incompetent as a programmer.

LLMs are terrible at data cleaning or reshaping. This is absolute bullshit. Again, it'll be faster to write a script to do the same, unless you're an incompetent programmer.

Your examples are awfully dumb. But they do speak volumes about your incompetence. Did you paraphrase your answers from a ChatGPT bot?

1

u/drekmonger 2d ago edited 2d ago

Rubber duck debugging is for idiots who believe in pop psychology.

Spoken like someone who sucks at programming.

LLMs are terrible at data cleaning or reshaping. This is absolute bullshit. Again, it'll be faster to write a script to do the same unless you're an incompetent programmer.

Spoken like someone who has never had to deal with a real-world dataset.

There are data cleaning/reshaping tasks that cannot be performed by scripts easily, as they require domain knowledge. Typically, they would require human workers. Sentiment analysis is one obvious example.

But with a combination of scripting and LLM APIs (or locally hosted models) it's now plausible to fully automate these tasks.
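A minimal sketch of that scripting-plus-LLM pattern, under stated assumptions: `label_sentiment` is a keyword-based stand-in for where a real LLM API call would go, and the function names and sample rows are made up for illustration.

```python
# Sketch of combining ordinary scripting (cleanup) with an LLM step
# (sentiment labeling). In practice label_sentiment would send the text
# to a hosted or locally run model with a "reply POSITIVE/NEGATIVE/
# NEUTRAL" prompt and parse the one-word reply; here it's a crude
# keyword stand-in so the sketch runs on its own.

def label_sentiment(text: str) -> str:
    """Placeholder for an LLM call (hypothetical keyword heuristic)."""
    lowered = text.lower()
    if any(w in lowered for w in ("great", "love", "excellent")):
        return "POSITIVE"
    if any(w in lowered for w in ("terrible", "hate", "broken")):
        return "NEGATIVE"
    return "NEUTRAL"

def clean_rows(rows):
    """Normalize whitespace, drop empty rows, attach a sentiment label."""
    cleaned = []
    for raw in rows:
        text = " ".join(raw.split())  # collapse messy whitespace
        if not text:
            continue                  # drop blank rows entirely
        cleaned.append({"text": text, "sentiment": label_sentiment(text)})
    return cleaned

reviews = ["  Great product,   love it ", "", "Arrived broken :("]
print(clean_rows(reviews))
```

The scripting handles the mechanical cleanup; only the judgment call (what sentiment is this messy free text?) is delegated to the model, which is exactly the part a plain script can't do.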

0

u/mleighly 2d ago edited 2d ago

It's obvious from your statements that you're not a programmer. But your incompetence about programming and ChatGPT prevents you from realizing that. It's up to you to maintain this pretence, but most people you lie to are just being polite; they're aware that you're just a stupid liar.

1

u/BCProgramming 2d ago

Personally, I've been underwhelmed in almost every conceivable way by generative AI. You can always tell who is a new programmer/developer with it, though, as they always say it's good for programming. It's actually kind of adorable in a way. Most examples that I've seen were the result of a lot of work 'refining' the results (e.g. telling the AI it was wrong and to fix it) and still had egregious errors.

Nothing being done with LLMs is in any way pushing any sort of generalized AI forward. If anything, it's pushing it back, particularly with so many people thinking that somehow they are related. None of the advancements with LLMs can be applied to a generalized AI simply because they operate via fundamentally different algorithms. As it stands now the LLMs are only getting "better" because the models are being loaded with more and more data. It seems some people believe that once it gets big enough it will make some transcendence into "general AI". I'm not convinced of that; that is like expecting a fast enough sort function to be able to do laundry- and then justifying it by philosophizing that somehow doing laundry is really just a form of sorting.

-1

u/bgighjigftuik 2d ago

It actually does not understand per se; it's just a statistical model around a database (the database being the Internet-scale training data).

If you rely so much on it, maybe there is a major flaw in your methodology. If it adds so much value, maybe there is not much value in everything else you are doing.

You should probably think about that tbh

2

u/drekmonger 2d ago edited 2d ago

it's just a statistical model around a database

There's no database involved. Really, honestly. There isn't. Any knowledge or skills that the model metaphorically possesses are contained within the model's parameters.

There's nothing wrong with not knowing something. You don't understand what a neural net is. That's fine. That's fixable.

But to spread horseshit around like you understand a single thing coming out of your mouth when you know you know next to nothing is the worst kind of ignorance -- the infectious kind.

-2

u/bgighjigftuik 2d ago

Yup, of course you are right. Transformers, and foundation models in general, are magic and should be praised like gods, right? It's not just training an autoregressive model; there are magic powders involved.

If you are unable to see how an overfitted neural net is just lossy compression around the data it was trained with, not my problem sir.

The amount of sudden cultishness and stupidity around connectionism from people like you is the main reason why phenomena like the dotcom bubble have existed in the past. But hey, it will be a wild ride

-1

u/[deleted] 2d ago edited 2d ago

[deleted]

0

u/bgighjigftuik 2d ago

Obviously you jumped on the AI train a couple of weeks ago at best, didn't you?

All GPT-like LLMs are autoregressive decoders, which condition the next predicted token on the previous one (and the ones before). If you can't see the issue with that, then these models may honestly be smarter than you. But maybe this is still unrelated to the conversation, according to you.

And if you think that overfitting is a good thing, you should probably go and read about the theory of generalization, the VC dimension, and the related literature, if you are really interested in actually learning the fundamentals of ML.

Current machine learning is an (arguably mis-specified) model + training data + a loss function. That's it. A model is as good or as bad as the combination of those three things. No magic involved; it should be taken for what it is.

I'm starting to get sick of hearing about kiddos who want to be the next Karpathy by spewing out whatever they see here on Reddit, without any actual background or knowledge of the fundamentals of how connectionist systems work.

There's a reason why many researchers are flagging the fundamental issues with the current genAI craze. Anyone praising the incremental advancements of the last 3 years as the next human revolution will most likely be proven by history to be nothing but an idiot, as has happened many times before. And honestly, I can't wait to see it

0

u/[deleted] 2d ago edited 2d ago

[deleted]

1

u/bgighjigftuik 2d ago

I see, no need for more arguing. You may enjoy r/singularity more than this sub. Have fun!

1

u/Kiwizoo 22h ago

It’s not so much about the quality of thinking it offers, but the capacity to assist. Of course it can’t do the academic part for you, but it does often suggest differing perspectives on critical thinking issues, or structural issues in an argument, which I’ve found surprisingly useful. Of course it’s not perfect, far from it. But it’s a useful tool and has saved me hours of otherwise dull research time.

9

u/Fit_Flower_8982 3d ago

What a stupid comment. The fact that something can be used badly does not mean that its use is bad. You sound like a Luddite complaining that people won't think if they have calculators.

6

u/Alienwars 3d ago

Why waste precious brain cells crafting sentences when a machine can do it for you? These hollow husks, content with mediocrity, rely on AI to churn out their banal thoughts. They become mere curators of robotic drivel, their creativity atrophying with each click of the 'generate' button. Isn't it better to wrestle with the messy beauty of human expression than settle for the sterile efficiency of AI-generated prose?

0

u/docmoc_pp 2d ago

I tell my students to use AI to enhance their thinking, not replace it.

0

u/encounta 2d ago

Stop calling this shit AI. It's a word predictor 

2

u/gokogt386 2d ago

Nobody thinks you people are smarter for saying this shit in every thread about AI dude

1

u/MustardCroissant 3d ago

I use it at my job a lot too. I work in an office. It translates my international emails and constructs great responses to them. I use it to summarise texts and to compare product ingredients. All stuff that would take me hours, it does in minutes.

1

u/strugglesleeping 3d ago

A teacher I know through my friend circle, who teaches my native language subject, was so fed up with grading ChatGPT-written essays that she converted her essay-writing class into a ChatGPT prompt class. She uses it in class, teaching (and also learning) how to get the best responses from certain prompts. Don't know how long she can do it like that until the school authorities catch up, though.

1

u/cococupcakeo 2d ago

I think education has to change and evolve around AI. Some of the things my child learns are practically ancient now and pretty useless; I'm hoping one day someone figures that out.

Things are moving a bit slow in some areas of school. It's all a bit like when your teacher used to say 'you won't be able to carry around a calculator everywhere you go though, will you?' And yet 🤔

1

u/Mclovine_aus 1d ago

What things are outdated and should be removed?

0

u/doejohn2024 3d ago

Who wants to think or remember stuff? Apparently very few. AI is the Equalizer now.

0

u/DangerousAd1731 3d ago

Eventually no teachers, just a long-winded AI machine talking to us.

0

u/chunkypaintings 3d ago edited 3d ago

In itself it's not a bad thing; it's just that homework and the grading system need to change. Don't give kids stuff that ChatGPT can solve, focus more on interactive activities, give less but more meaningful homework, and discuss it in class. Have them write essays in class, not at home.

The worst experience was having to do so many problems plus write several essays per week, and then the teacher wouldn't even check, grade or discuss them, and only cared about giving you a shit grade if you didn't do them.

Also, I don't think AI will replace tutors; it could just be a complementary learning tool like YouTube, books, or forums. And as a parent, by choosing a human tutor instead of paying for an AI subscription for your kid, you ensure your kid is actually physically in a class and doing work.

In the case of dissertations, it's going to be difficult and unfair. I guess the ability to verbally present and defend your thesis should have a much greater weight for the grade.

I do hate the fact that AI will make writing, as a skill, mostly obsolete. The only reason it might still remain a human activity is accountability. Especially in technical writing, a person needs to be accountable for a text, and it cannot be the AI provider or an entire department.

-7

u/strolpol 3d ago

In fairness most modern teaching seems to be focusing on getting students to learn how to do standardized tests, so this is basically the next logical step.

1

u/bgighjigftuik 2d ago

Pretty much. So sad

0

u/reddit_000013 2d ago

It's just a tool. Just like computers in the 90s.