Learning ANY new Language

Although I’ve always been interested in software languages, I admit I don’t learn them quickly. I just spent several hours trying to create Spock tests of Jenkins scripts, and I realized how poorly I am able to go from a specification to actually implementing what I want. My question is, do we learn computer languages any differently than spoken ones? Which is to say, do we mimic (for years), then perhaps take some classes, then perhaps learn how the language is actually structured?

Or is this a group that can read a specification and immediately begin writing? Let me ask it this way: how can one pick up a language without hunting down a similar example? Cheers.

There was some research on computer vs. human language processing recently; see Study Finds Brain Activity of Coders Isn't Like Language or Math - Slashdot

In my experience, the key to easily understanding (new) computer languages is to understand the underlying concepts. As an example, once you know how an operation call works, it doesn’t matter whether you’re calling a method, function, procedure, or whatever it’s called in any specific language. They might differ in details, but those details are rather easy to pick up based on a solid conceptual understanding.
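
To make that concrete, here is a minimal Java sketch (class and method names are invented just for illustration): an instance method call and a plain static function call are the same underlying idea, only packaged differently.

```java
public class CallDemo {
    private int base = 2;

    // An instance method: the receiver object supplies part of the data.
    int addToBase(int amount) {
        return base + amount;
    }

    // A static function: no receiver object, just arguments in, result out.
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        CallDemo demo = new CallDemo();
        // Both calls follow the same underlying concept: a name, arguments, a returned value.
        System.out.println(demo.addToBase(3));   // 5
        System.out.println(CallDemo.add(2, 3));  // 5
    }
}
```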

2 Likes

The following book might be interesting in this context: Manning | The Programmer's Brain

2 Likes

Just speaking from my own experience of having to learn a dozen computer languages, the two kinds of languages are entirely different. Firstly, human languages have hundreds of thousands of names of things, which require pure memorization with no semantics. Then there are a few hundred adjectives and just a few hundred verbs. So learning a human language is a mix of learning verb forms that agree with gender and the number of people involved (the most common verbs are about the body), plus brute memorization.

Hungarian is well known as the hardest European language because almost all verbs are irregular. Chinese is well known as the hardest language of all, because there is no logic to the glyphs, of which there are over 50,000, and the pronunciation (and tone!) of each character is not embedded in its shape.

Computer languages can typically be learned in days. They often have only about 100 words in total. It is all about combining those words into valid sentences which do something: declaring a piece of memory by giving it a name, then using the few verbs to affect those variables. Where memorization comes into play is in learning the API libraries of the OS and of the language itself. A language like Java has thousands of APIs; so does Apple’s OS X.
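
As a rough sketch of that split (purely illustrative Java, names made up): the core-language part is a couple of lines of naming and mutating memory, while everything beyond it is library knowledge.

```java
import java.util.ArrayList;
import java.util.List;

public class CoreVsLibrary {
    public static void main(String[] args) {
        // Core language: declare a piece of memory by giving it a name, then mutate it.
        int count = 0;
        count = count + 1;

        // Library knowledge: the same idea expressed through the standard API,
        // which is where most of the memorization effort actually goes.
        List<Integer> counts = new ArrayList<>();
        counts.add(count);

        System.out.println(count);   // 1
        System.out.println(counts);  // [1]
    }
}
```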

So the libraries take more work to learn than the language itself, but one is not productive in a language unless one knows those libraries. Learning Java takes a few days, but it takes many months to learn which of the 10 string libraries to use… (you can tell I am not a fan of Java).
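
To illustrate the point, even the Java standard library alone offers several ways to build the same string; picking the idiomatic one is library knowledge, not language knowledge. A small sketch:

```java
import java.util.StringJoiner;

public class ManyWaysToJoin {
    public static void main(String[] args) {
        // Several standard ways to produce the same "a, b, c" string.
        String concat = "a" + ", " + "b" + ", " + "c";

        StringBuilder sb = new StringBuilder();
        sb.append("a").append(", ").append("b").append(", ").append("c");

        String formatted = String.format("%s, %s, %s", "a", "b", "c");

        String joined = String.join(", ", "a", "b", "c");

        StringJoiner joiner = new StringJoiner(", ");
        joiner.add("a").add("b").add("c");

        System.out.println(concat.equals(sb.toString()));      // true
        System.out.println(formatted.equals(joined));          // true
        System.out.println(joined.equals(joiner.toString()));  // true
    }
}
```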

There have been a few languages, like Visual Basic and Delphi, that have come along and let people program without learning giant libraries. These were highly productive environments. We are now in an era that worships complexity, and the development stack of the most commonly used environment, the web browser, is the most complex stack ever in common use; one programs in an unholy mix of HTML, JS, CSS, various frameworks, and perhaps some database tools as well. It’s really quite embarrassing how bad it is.

3 Likes

Well, as others said, I don’t think this has much to do with the “language” per se, i.e. the basic syntax and semantics; it has a lot to do with the mindset underlying the language, and with the vocabulary and ecosystem that arise from it.
For example, in C the implicit mindset is that the world is made of memory locations with a definite size in bytes, and all problems are encoded as mutations of the data in those locations. In Haskell, the mindset is that the world is made of immutable values and all problems are encoded as types that describe those values and functions that describe transformations between them. In Lisp, the world is made of syntactic trees constructed from cons cells and symbols, and all computation is encoded as functions that directly or indirectly operate on those trees. And so on. So, the very same problem will be solved in very different ways by experienced C, Haskell, or Lisp hackers, and all of them are “right”.
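To sketch that contrast without leaving a single language (purely illustrative; a real C or Haskell hacker would of course not be writing Java), the mutate-memory-in-place mindset versus the transform-immutable-values mindset might look roughly like this:

```java
import java.util.List;
import java.util.stream.Collectors;

public class TwoMindsets {
    public static void main(String[] args) {
        // "C-style" mindset: a fixed block of memory, mutated in place.
        int[] values = {1, 2, 3};
        for (int i = 0; i < values.length; i++) {
            values[i] = values[i] * 2;
        }

        // "Haskell-style" mindset: leave the input alone and describe a
        // transformation that yields a new immutable value.
        List<Integer> input = List.of(1, 2, 3);
        List<Integer> doubled = input.stream()
                .map(x -> x * 2)
                .collect(Collectors.toList());

        System.out.println(doubled); // [2, 4, 6]
    }
}
```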
Learning the rules of the language and a basic vocabulary might be easy, but becoming a “native speaker”, so to speak, is hard, as it is with very different natural languages such as Italian and Chinese.

4 Likes

Thank you. I’ve ordered the book (and read the article, which I had only glanced at).

I must admit I am not a fan of all the JS web development where JS libraries emit HTML/CSS, but I myself built a web application with what is basically an isomorphic XML library, by Orbeon. My colleagues mostly disliked it because of, IMHO, the learning curve and the generally difficult readability of large XML files. But is it good to obscure some core component/system (e.g. HTML behind JavaScript)? Well, apparently so.