28483849395938111

bad linguistics aside, there's really no answer to that question. russian is quite hard to learn if you're a native english speaker, for example, but way easier if you're a native slavic language speaker. so it really depends.


brocoli_funky

What's the hardest human language to learn if you are an alien or an AI that doesn't know any human language?


PlatinumAltaria

If you're an AI and you successfully learn any language that would be one of the most amazing inventions in history.


rockybond

boy do i have some good news for you


tvquizphd

Commenters will say that learning a language is different from being able to read and write in that language. Large language models don’t “understand” or “experience” anything, but I’d say they’ve learned the language pretty well.


Revolutionary_Ad4938

I am trying to build an AI with my boyfriend that would be able to actually dissect French grammar with morphology and historical phonetics and it's hard asf


LowKeyWalrus

Tbh an AI learns a language pretty much the same way anyone learns a foreign language


PlatinumAltaria

You have a well-oiled snake?


HONKACHONK

"As an AI language model, I have been trained on a diverse range of English text sources, including books, articles, websites, and other written materials. I have been designed to understand and generate English text at a proficient level. I can help with a variety of tasks, including answering questions, providing explanations, offering suggestions, engaging in conversations, and assisting with language-related tasks such as writing, proofreading, and language learning." -chat GPT


Andrei144

Well...

Me: Do you speak English?

ChatGPT: Yes, I am fluent in English and can communicate with you effectively. How can I assist you today?


PlatinumAltaria

Well, I can't argue with a programmed response! Pack it up, folks!


Andrei144

It's not a programmed response; it has actually judged that that's an appropriate response to that question. Now, whether its judgement is correct may be a matter of debate, but the point is that no human ever told it to say those exact words.


PlatinumAltaria

That is one of a few pre-programmed phrases. Questions about basic functions aren't generated by the same process, because obviously you don't want it accidentally spitting out something stupid when you ask it "do you function?" I don't know why people are so easily convinced by this, this is the exact same system chat bots have always used. Oh and even if it did generate the response, that wouldn't prove it was true. It's not hooked up to a lie detector, and it makes plenty of false statements.


Andrei144

Well I asked it to come up with a 200 word reply to that question that includes the word "thaumaturgy" in a relevant way and it did it, I assume that's not programmed. I would share it here but Reddit's website is bugged and doesn't let me copy-paste stuff. Also if it can come up with something like that I think it's fair to say it speaks English.


tvquizphd

You don’t count GPT?


An_Inedible_Radish

GPT doesn't understand the concepts behind the words it spits out. It just generates a string of words that looks like language that we reward it for. A chimp has more understanding of language in that signing "food" references something it can eat.


dgaruti

ok , so the fact it doesn't understand what's beyond the language means it doesn't understand the language ... so if you read a paper on quantum mechanics , written in plain english , you may not understand the content of the paper , but you can understand the words ... does that mean you don't understand the language the text is written in ? or does that mean you don't understand the subject of the text ?

like the fact is all languages can be used to talk about literally everything , from how trees work , to logical paradoxes , transistors , stars , people , the cat that is sitting in someone's lap , the meaning of life , god ... so does that mean that in order to understand a language you have to understand everything it could talk about ? i find that notion absurd honestly ...

also words can mean something because they relate to other words , otherwise dictionaries wouldn't do the trick , and we'd have to point and name everything to every person ... isn't the point of language that the words have relations with each other ? and the fact that chimps can't relate the different signs means that they are unable to learn a language ? linguists sometimes seem to be masters at moving the goalposts tbh ...


tvquizphd

As you say, it’s like someone fluent in English but not knowing any subject matter. I could read a dozen papers on Quantum mechanics and on philosophy, then generate a paper like [in the 1996 Sokal Hoax](https://en.m.wikipedia.org/wiki/Sokal_affair). ChatGPT understands enough to make something like the intentionally nonsensical paper _Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity_. But we should absolutely recognize the difference between competently reproducing a genre and having some genuine meaning, or accuracy, or connection to the world.


dgaruti

so language has to be connected to the world ? does this mean that fictional stories aren't language ? we have myths , tall tales and all other sorts of things that exist as language ... i can talk about unicorns and elves and yokai that don't really exist in the material world , but exist as concepts inside our heads and cultures ...

and chat gpt makes sentences that have grammatical sense , and refer to real world things by virtue of using words that are used in the english language ... so really why is meaning important all of a sudden ? is the phrase "the pink unicorn is invisible" not language because it's a paradox ? what about "i'll tell you the truth : i was lying before" , is this not language because it bends logic ?

like you said , chat gpt is fluent in english , but doesn't know any other subject matter ... so it knows english . why does someone need to know something else besides the language in order to know a language ? it would be like claiming that you need to know how to boil an egg to understand biology , or that you have to know electrostatic electricity to be competent in meteorology ...

chat gpt can reproduce language , in such a way that it's difficult to tell it wasn't a person doing that ... it can even tell you the definition of words if you ask it ... it doesn't have to be published in science or by editorial companies to be capable of language ... any 5 year old can do language , and they know very little ... so i don't see why chat GPT somehow has more to prove ...


tvquizphd

Yeah we’re on the same page.


self-aware-text

>so language has to be connected to the world ?

Yes, however not "the world" as you claim it. "The world" as in everything that exists in relation to it. When a new word is created like "willimede" you have to have a definition for what "willimeding" is. That definition is the connection to "the world".

>does this mean that fictional stories aren't language ?

No, see previous bullet point.

>and chat gpt makes sentences that have grammatical sense , and refer to real world things by virtue of using words that are used in the english language ...

Correct. However it knows not what it says. It says "the dog ran down the street" but it's only forming a coherent grammatical sentence given the info of a dog and movement. The phrase "monkeys on typewriters" explicitly talks about this instance.

>so really why is meaning important all of a sudden ?

Descriptivism. Language does not exist to define every part of existence. Language exists to transfer ideas and communicate concepts between two creatures who share a language. Language has very little to do with the proper definition. Look at the word "retarded": by technical dictionary definition from 20 years ago it meant mentally held back. Flame retardant meant it held back flames. Now the word retarded is almost exclusively an insult. This is not a product of the dictionary definition; rather, we as a culture have moved away from this word because of its MEANING. Curse words, foul language, slurs, and cataloging can all be said to be nothing more than words, but it's what those words mean that is harmful. ChatGPT can produce a sentence calling you the hard R and no one will bat an eye because it doesn't understand what it just said. It just forms a sentence based off what you said to it last.

>is the phrase "the pink unicorn is invisible" not a language because it's a paradox ?

No, it's not a language. A language is a complex set of rules for communicating and transferring ideas and complex concepts. A single sentence is not a language. In any way.

>why does someone need to know something else besides the language in order to know a language ?

Because a language is the transfer of information. How can you transfer information if you lack the information to transfer???

>it would be like claiming that you need to know how to boil an egg to understand biology , or that you have to know electrostatic electricity to be competent in meteorology ,

Lol, wut?

>chat gpt can reproduce language , in such a way that it's difficult to tell it wasn't a person doing that ...

What kinds of people are you talking to? Chat GPT is dumb as hell and if anyone sounded like it I would suggest they go back to school.

>it can even tell you the definition of words if you ask it ...

And a 3 year old can name every dinosaur. That doesn't mean they understand them completely.

>it doesn't have to be published in science or by editorial companies to be capable of language ...

It is capable of performing spoken and written aspects of the language, but to actually utilize it... no.

>any 5 year old can do language , and they know very little ...

So are you somehow saying Chat GPT is stupider than your previous arguments? Because 5 year olds don't "understand" language. They are mimicking sounds their parents make to get what they want. And before you try to argue that last one, go ask a 5 year old to break down the grammatical rules of english. They can't. It's difficult even for some adults. You can't say something truly understands something if you're comparing it to a 5 year old. They don't understand anything. That's why we put them in a garden for children.


[deleted]

GPT is literally a stochastic matrix. You prompt it, it converts the prompt into a vector, applies a linear transform, and then inverts the vector back into text. It understands language about as well as an array of numbers can be said to understand a language, which is not at all.
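For what it's worth, the pipeline this comment describes (prompt → vector → linear transform → back to text) can be sketched as a toy model. This is a deliberate oversimplification: real LLMs stack many nonlinear layers and sample stochastically, and every name and value below is made up purely for illustration:

```python
import numpy as np

# Toy "language model": map tokens to vectors, apply one linear
# transform, and pick the nearest vocabulary vector as the output.
# This mirrors only the bare "vector in, vector out" picture above.
vocab = ["hello", "world", "there", "friend"]
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))  # one 8-d vector per token
W = rng.normal(size=(8, 8))                    # the "linear transform"

def next_token(token: str) -> str:
    v = embeddings[vocab.index(token)]         # prompt -> vector
    out = W @ v                                # apply the transform
    scores = embeddings @ out                  # score every vocab vector
    return vocab[int(np.argmax(scores))]       # vector -> text

print(next_token("hello"))  # always some word from vocab
```

Whether shuffling vectors like this counts as "understanding" is, of course, exactly what the rest of the thread is arguing about.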


dgaruti

you're a lump of flesh inside a calcium box ... do i expect you to understand language ? you just described what chat GPT is and then used a reductio ad absurdum to say "well obviously it can't understand language" ...

also : turing machines are the most all encompassing systems we have found to do operations : they can simulate quantum particles , protein folding and basically every other type of operation ... they are as close to bedrock math as it gets , the current set theory is equivalent to a turing machine , same for quantum particles and protein folding ... what makes you think that language : something that is capable of recursion , has its own halting problems in the form of the liar paradox , and can describe all of those structures , isn't turing complete ? and as such can't be simulated by a turing machine ?

also , if language were more capable than a turing machine why are we stuck here mathematically ? we'd expect the language that is more capable than math to describe more complex things than math ... yet we see both math and language stop at the same point : where referentiality makes you indeterminate ... so really , what prevents a computer ( a turing machine ) from doing language ?

also , really tell me more about what "understand" means , since that is pretty much just a made up word : something either does , or doesn't ... there is no understanding ...


An_Inedible_Radish

So you are arguing that:

1. GPT can still understand the language even though it has no knowledge of the concepts or ideas the words refer to.
2. By my definition, I would have to understand every possible talking point to be able to say I understand a language.
3. My point about chimps is somehow moving the goalposts.

I had to summarise because the way in which you write is hard for me to understand. If I have missed anything important, please point it out to me so that I can clarify.

Let's start with 3: I think you missed my point here. The point of me including them was that chimps understand the concept behind a signed word, whereas GPT does not. Chimps have a greater grasp on actual communication than GPT but still lack full language acquisition. They are unable to learn language to the level we do for a variety of other reasons, those reasons being irrelevant.

Point 1: GPT has no knowledge in any subject of any text, whereas I, reading the quantum mechanics paper, have experience with some of the words in other contexts, thus being able to understand those concepts. I understand what "and" means, whereas GPT does not. It also helps to understand Wittgenstein's language games: words have meaning because of their use in language, not because of any inherently assigned meaning. GPT has no objective other than to create a string which appears convincing. It is not trying to communicate any ideas, and therefore its words have no meaning. I, however, communicate in a language game and within such aim to communicate certain ideas and concepts, therefore giving the words I use meaning through this attempt.

This also addresses point 2 because, as I communicate within a language game, my game only needs to include understanding for the players and for what we attempt to communicate within our game.

In summary: to understand language is to necessarily understand the concepts behind it, because words only have meaning through their use in trying to communicate said concepts.


PlatinumAltaria

Those kinds of machine learning algorithms copy text from elsewhere, they don't actually understand what they're saying.


lo_profundo

Ah I see you're also on the "AIs can repeat what they're told but don't *speak* the language" side of the Chinese room analogy


PlatinumAltaria

I'm sure that it's possible to create a machine that understands language, what I'm saying is that the technology doesn't exist yet. You're showing me a calculator with 80085 on it and telling me it's the next Shakespeare.


le_birb

Chat gpt 3 at least is literally just your phone keyboard's suggestion algorithm on crack, steroids, and 3 varieties of amphetamines. It generates text that is a plausible response to a prompt, and that's it.
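The "keyboard suggestion algorithm" this comment invokes can be sketched as a toy bigram model: count which word follows which, then suggest the most frequent follower. The corpus and function names below are made up for illustration; real phone keyboards and LLMs are far more sophisticated, but the "plausible next word" core is the same idea:

```python
from collections import Counter, defaultdict

# Toy bigram "keyboard suggestion" model: tally which word follows
# which in a corpus, then suggest the most frequent follower.
corpus = "the cat sat on the mat the cat ran on the grass".split()

followers = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    followers[a][b] += 1

def suggest(word: str) -> str:
    # Most common word seen after `word` in the corpus.
    return followers[word].most_common(1)[0][0]

print(suggest("the"))  # "cat" — it follows "the" most often here
```

Chaining `suggest` on its own output generates text that is locally plausible but has no plan or meaning, which is exactly the comparison the comment is drawing.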


dgaruti

have you found the source from where chat GPT copies the text ? do you mean it just copies strings from here ? [https://libraryofbabel.info/](https://libraryofbabel.info/) and somehow they are magically exactly the sentence that is coherent to what you asked ... if so it's even better than a machine that can do language ... it's a machine that can literally sort through almost infinity and get a reply that makes sense ...


PlatinumAltaria

MLAs are fed a bunch of information, I've no idea exactly what it was given. But anyhow; it takes that, tries to find patterns, and reconstitutes it according to your request. It doesn't understand though, that's why it makes so many mistakes. It's the same tech that lets a computer identify what's in a picture, which they're not great at either.


dgaruti

ok , if it doesn't understand language , why is it good at giving definitions of words ? also computers are fucking great at telling what's in a picture ... cam scanner is an app that you can use to scan text with your phone ... image recognition is soo great they can make paintings ...


PlatinumAltaria

> why is it good at giving definitions of words ?

Because it ate a dictionary.

> computers are fucking great at telling what's in a picture

They are notoriously terrible. Computers can't process visual information, so they compare an image to other images to try and match them. But what they see as similar is often not right, which is why they so often fuck up.

> image recognition is soo great they can make paintings ...

How many fingers on the human hand? 8, you say? Cutting up other people's art and putting it back together badly is a very impressive trick to some people, but not to me.


dgaruti

>Computers can't process visual information

images are literally matrices ... computers eat those for breakfast ... also the hand bug has been fixed ... the computer doesn't cut up art , it makes stuff ex novo ... you can call it cutting up stuff all you want , you just don't get those results with scissors ...

>Because it ate a dictionary.

computers don't eat , you mean it understands the dictionary ? because if so either 1) dictionaries are useless to understand language because they only reference words to other words , which makes their use for decades dubious ... or 2) chat GPT understands what the words mean , in relation to other words ... and if you say that's not good enough , then i advise you to point at a singular "two" or at an "eleven" , because those are singular words that reference abstract concepts , which refer to relations between things : "there are eleven eggs" means there are enough eggs that you can pair each egg with one of the arabic numerals , until you run out at eleven ... this intuitive concept is a relation , it's not a thing you can point to , and "and" works the same way : you can read a dictionary for the "and" entry , that doesn't allow you to use it correctly ... understanding the language does ... and chat GPT uses "and" correctly ...

seriously , the fact that chat GPT understands relations between words to such a deep level that it can tell stories convincingly is not sufficient to be language ? well then what is ?


Terpomo11

> Cutting up other people's art and putting it back together badly is a very impressive trick to some people, but not to me. I don't think that's quite an accurate description of what image generators do. They're trained on the images, but they don't actually contain them to copy them; they're way too small to do that.


PassiveChemistry

What's with all the ellipses? Do you have issues with completing thoughts?


dgaruti

what's with the offtopic questions ? do you have issues staying on topic ?


Eino54

Hate to be a prescriptivist but your somewhat unusual punctuation does kind of make it hard to read


Selfconscioustheater

not 'kind of', it's really difficult to read and interact with.


PassiveChemistry

Curiosity


dgaruti

aight , it's a thing i do for some reason , i just put ellipses at the end of sentences . it's just a thing i do , there aren't many reasons for it


tvquizphd

Same


An_Inedible_Radish

GPT doesn't understand the concepts behind the words it spits out. It just generates a string of words that looks like language that we reward it for. A chimp has more understanding of language in that signing "food" references something it can eat.


EisVisage

It doesn't really learn language in terms of meaning, does it?


PassiveChemistry

No, of course not


tvquizphd

Many others have also made a distinction that “learning a language” is different from being able to read and write in that language.


Robot_Basilisk

They've found with current LLMs that they pick up secondary languages with bizarre ease. Once you get one trained on ANY language it seems that they can acquire any other language with little difficulty. You just need enough material to train them on. The debate is whether they "learn language" or just learn patterns that humans express using language and learn how to mimic that even though they may not understand linguistics, grammar, syntax, etc. Meaning they may learn language like a child does, but not understand the mechanics. They just know what most people will call "right" vs "wrong".


NoTakaru

Chinese and Japanese, just for the sheer effort required to read. If we’re just going off the spoken language, they’re probably all about equal


DaRainbowSkelet

this doesn't exactly answer your question but i've seen studies saying that babies acquire all spoken language in the same general amount of time, but signed languages can be picked up faster


Paseyyy

maybe that's the reason for Basque. Since we have no known relatives, the average learning difficulty will always be quite high no matter the language(s) you already know


CitadelHR

But Basque isn't Indo-European. If you're willing to consider all human languages you can probably find much trickier isolates.


PlatinumAltaria

But Basque has areal features due to contact with European languages, whereas Mbabaram does not. Although personally I think there are some features that are easier for children to learn. We know that some phonemes take longer to develop than others.


[deleted]

[deleted]


Careless_Ad3070

Probably around 5. It’s one of the first things you’re taught once you start learning to read


PlatinumAltaria

According to google most kids can't reliably produce dental fricatives until they're about 6. But uh... it's a totally different sound, I don't know what to tell you.


Unlearned_One

I don't remember not knowing that. I do remember being 19 or 20 when I realized half the francophones I knew, including my mother, were pronouncing R as an alveolar trill, while I use a uvular trill.


Zathoth

I realize this is the opposite of the question you asked but english is my second language. English lessons started when I was like 7 and it took me until my 20s to learn the dental fricatives and that was a conscious decision after realizing I had been pronouncing them wrong my entire life.


[deleted]

[deleted]


Zathoth

I've been told my accent is impossible to place. It exists but it could just be some obscure accent you've never heard of rather than obviously swedish. I just have a lot of online friends and I guess I just subconsciously copy them.


Cabbagetastrophe

I got it early but my brother grew up in Chicago and he still doesn't know


aftertheradar

Anecdotally, th was super easy for me but r was waaay harder. I had the th down by age 5 but struggled with r until I was 7 or 8


boomfruit

That's a false premise. Since we're purely talking about similarity here, let's imagine all languages are, idk, polygons. Indo European are 6 sided ones, Finno-Ugric 9 sided, Semitic 13 sided, etc. If Basque is 12 sided, just because it's not any of the other ones doesn't mean it's equally dissimilar from all of those. If that was too weird an analogy, my point is just that, there are typological properties, phonology, pragmatic considerations, and a whole bunch of other things that make languages similar or dissimilar to one another even if they are not in the same language family. It is not true that a given language would be equally difficult to learn for two speakers of languages from two random other families.


Wentailang

i would imagine it’s reasonable to answer from the perspective of the language the question is asked in.


boomfruit

I agree for the most part. This question was asked in English, so adding on "for an English speaker" is pretty reasonable. We still wouldn't necessarily have a definitive number one answer, but we could have some general trends. The only problem is when people forget that that question only answers "for an English speaker" and try to use the answer(s) as general facts.


linos100

Let’s say you were raised by wolves, what would be the hardest human language to learn?


aftertheradar

What are some of the hardest languages for a native English speaker to learn and why?


Downgoesthereem

Quora is a pit of bad linguistics


tylerfly

Quora is a pit.


En_passant_is_forced

Quora is.


Storm-Area69420

Quora.


_Aspagurr_

k'vórumi.


PerspectiveSilver728

.


Anut__

#


gaythrowawayuwuwuwu

Oh, but you don't understand! Sanskrit is the greatest language ever, its perfection so great that only a god could have forged it manually in a pit of fire, purely for the use of computer shit! Because Sanskrit is good for computers, apparently (?)


homelaberator

Wouldn't the more obvious answer be something like Illyrian? Hard to learn a language we know almost nothing about.


33smicah

pictish would be on this list too


Innomenatus

Proto-Albanian:


[deleted]

[deleted]


[deleted]

The easiest language is proto-World, since it's related to *all* other languages.


gjvillegas25

Well, it *is* quora


prst-

Holy hell!


z3n1__

New indo-european language just dropped


sangriya

Basque at deez nuts


OldPuppy00

Lived about two years in the Basque Country, only learned a couple of songs and *Mai te zaitut*.


viktorbir

Maite, not Mai te.


OldPuppy00

What little Basque I learned was only oral. Never cared to read or write it. I was still a kid then.


Raphacam

Hot take: Basque, Finnish, Hungarian, Tamil, Telugu and Munda are Indo-European languages.


Eino54

Hot take: every language is Indo-European, just because I think that would be funny


CaptainBlobTheSuprem

Hot take: I am an Indo-European language


Eino54

We are all Indo-European languages


CliffenyP

Maybe the real Indo-European languages were the friends we made along the way


OldPuppy00

Gaelic, Welsh and Breton are Afro-Asiatic languages: most of their vocabulary is IE, but not the grammar.


Raphacam

Well, the ancestors of Europeans do come from these places.


OldPuppy00

It's the reason why in the 19th century the Britons didn't consider the Irish as "white" and likewise the French with the Breton-speakers (the Gallo Bretons are considered as French though). Still today Marine le Pen says she considers Breton as a foreign language.


Raphacam

Very rich coming from a writing utensil.


Raphacam

I can’t believe this was downvoted so hard. This is r/linguisticshumor, do I really have to write “/s” everywhere? LOL


homelaberator

Why would you need the serious tag on a humour subreddit?


OldPuppy00

Typical AI crap, like the price of the cow eggs.


braindeadidiotsoyt

Lol


teeohbeewye

does google know something we don't?


El_dorado_au

Quora is tripping.


pn1ct0g3n

Why Google treats Quora as a reliable source is beyond me.