85

I often hear that a real programmer can easily learn any language within a week. Languages are just tools for getting things done, I'm told. Programming is the ultimate skill that must be learned and mastered.

How can I make sure that I'm actually learning how to program rather than simply learning the details of a language? And how can I develop programming skills that can be applied towards all languages instead of just one?

samthebrand
Ryan
  • @JimmyHoffa, why would learning Prolog take more than a week for someone already familiar with first-order logic and, say, the Hindley-Milner type system? Agda does not introduce significantly new concepts for those who have already seen ACL2 or HOL or Coq. It is sufficient to understand a small number of core concepts to be able to learn any of the existing languages in less than a week. – SK-logic Oct 10 '13 at 17:13
  • 26
    Try to learn another language. Try to solve problems that you already know how to solve in your first language in your new language. It won't be easy at the beginning. But you'll know you're learning once re-solving old problems in a new way becomes noticeably easier (note: this may take a bit of time). – FrustratedWithFormsDesigner Oct 10 '13 at 17:14
  • 43
    Also, people who claim to be able to learn a language in a week need to define what they mean when they say "Learn". `"What do you mean you're not an expert in LanguageX?!? I can learn a language in a Week!"`. 1 week later: `"See, I've learnt the language, and here's a Hello World example I copied from Wikipedia to prove it!"` – JohnL Oct 10 '13 at 17:17
  • @JimmyHoffa: I'm not sure I agree that this is a question about educational advice (which language should I learn, which class should I take). It's more about the thinking process of becoming a programmer. – Robert Harvey Oct 10 '13 at 17:23
  • @RobertHarvey I'm not certain, either way I'm leaving the CV for others to make their decision themselves but I'll remove the comment – Jimmy Hoffa Oct 10 '13 at 17:24
  • 2
    @JimmyHoffa I suspect the OP heard a version of "once you've learned a few programming languages, new ones are easy to pick up" and misunderstood... – Izkata Oct 10 '13 at 17:31
  • 3
    I'd like to see *anyone* learn to be competent in [Befunge](http://en.wikipedia.org/wiki/Befunge) in one week... – Bobson Oct 10 '13 at 17:55
  • 10
    One question needs to be asked. Do you construct your logic in syntax or do you use a faster and more efficient mental model? I find that novice programmers tend to think using syntax. – ChaosPandion Oct 10 '13 at 17:57
  • 5
    Real programmers don't learn languages - they use them. – Charles D Pantoga Oct 10 '13 at 21:29
  • 3
    It takes ~10,000 hours to learn anything worth learning. – Paul R Oct 10 '13 at 21:40
  • 11
    @PaulR: It didn't take me 10,000 hours to learn how to ride a bike. Or swim, for that matter. – Robert Harvey Oct 10 '13 at 21:43
  • 5
    Oh well, there goes that theory then... – Paul R Oct 10 '13 at 21:44
  • 2
    @Bobson, that's more due to Befunge syntax rather than the underlying concepts. – Wayne Werner Oct 10 '13 at 23:01
  • 1
    I always see the difference as... Programmers (and developers in general) are good at problem solving. Learning a programming language is not problem solving, it is learning how to use these tools and let a computer perform your solution. – Jeroen Landheer Oct 11 '13 at 00:47
  • I believe you might get more answers and possibly insight if you hold off on selecting an answer for a day or two. – ChaosPandion Oct 11 '13 at 01:06
  • 6
    I bet to be a _master_ of biking (grand prix) or a marathon swimmer would take 10,000+... – Michael Durrant Oct 11 '13 at 01:27
  • Consider taking this online class in coursera: https://www.coursera.org/course/proglang. Upon completion, I am confident you will be able to learn most languages quickly. It is running now and the first homework is due in less than a week! – Hery Oct 11 '13 at 04:34
  • My strategy has always been to focus on pure skills rather than specific skills. – Morg. Oct 11 '13 at 06:16
  • 8
    @PaulR The saying is about it taking 10k hours to _master_ a skill, not to merely "learn" it – Tobias Kienzler Oct 11 '13 at 10:10
  • 1
    @Tobias: yes, it depends on what you mean by "learn" - personally I don't consider I've learnt something fully until I really have mastered it, but others may say they have learned something when all they really have mastered are the basics. – Paul R Oct 11 '13 at 10:32
  • 1
    Programming is different than coding. – MirroredFate Oct 11 '13 at 16:25
  • 2
    @PaulR Indeed, "learn" is a rather ambiguous term - some already call something learned when it's merely been "[Pavloved](https://en.wikipedia.org/wiki/Classical_conditioning)" – Tobias Kienzler Oct 14 '13 at 07:47
  • @Robert Harvey can you drop 10 meters on a bike or swim out a hurricane? Learn a programming language quickly in 5 years. – Vorac Nov 07 '17 at 22:04

11 Answers

96

Don't worry about meeting some ridiculous concept of "skill" so commonly heard in such statements like:

  • All programming languages are basically the same.
  • Once you pick up one language well you can pick up any other language quickly and easily.
  • Languages are just tools, there's some overarching brain-magic that actually makes the software.

These statements are all based on a flawed premise and betray a lack of experience across a broader spectrum of programming languages. They are very common statements and strongly believed by a great swath of programmers, I won't dispute that, but I will dispute their accuracy.

This is easily demonstrated: spend one week (or really any amount of time greater than a couple of days) trying to learn the fundamentals of Haskell, Prolog, or Agda. Soon afterwards you will start hearing the old Sesame Street song play in your head: "One of these things is not like the others...".

As it turns out, there is a whole swath of programming languages, techniques, and approaches so foreign to what 95% of us do or have ever done that many of us are completely unaware these other concepts even exist. Which is fine; these concepts aren't necessary to be an employed and even effective programmer.

But the fact remains: These techniques and approaches do exist, they are good for many different things and can be very useful, but they are not just like what you're used to and people cannot simply pick them up with an afternoon of fiddling.

Furthermore, I would say that in the majority of cases where people claim they have learned, or can learn, something as complex as a programming language in as little as a week, they are suffering from a bit of the Dunning–Kruger effect. From Wikipedia (emphasis mine):

The Dunning–Kruger effect is a cognitive bias in which unskilled individuals suffer from illusory superiority, mistakenly rating their ability much higher than average. This bias is attributed to a metacognitive inability of the unskilled to recognize their mistakes.

I would refer people to a more experienced perspective on the concept of learning to program, by Peter Norvig: Teach Yourself Programming in Ten Years.

Researchers (Bloom (1985), Bryan & Harter (1899), Hayes (1989), Simon & Chase (1973)) have shown it takes about ten years to develop expertise in any of a wide variety of areas, including chess playing, music composition, telegraph operation, painting, piano playing, swimming, tennis, and research in neuropsychology and topology. The key is deliberative practice: not just doing it again and again, but challenging yourself with a task that is just beyond your current ability, trying it, analyzing your performance while and after doing it, and correcting any mistakes. Then repeat. And repeat again.


Surely, there is a set of overarching principles that will make all languages easy to learn!

Perhaps, but I would argue this set of principles is so large that there will almost always be languages outside of your one-week reach. As you add new concepts to the list you're familiar and comfortable with, this list of languages outside your immediate reach may shrink, but I have a hard time believing it will ever go away. The list of conceptual computing approaches to things is so broad it's baffling, from concatenative languages to vector based languages to languages specializing in AI or metaprogramming (or languages which exist entirely to support regular expressions).

After ten years you will be able to generally program. This means you can write somewhat decent code in some language or style of languages. So after 10 years you are ready to start tackling these countless broad cross-cutting concepts for the rest of your life, and short of being Edsger W. Dijkstra, Donald Knuth or John D. Carmack, you're not going to get to all of them.

Peter Mortensen
Jimmy Hoffa
  • 12
    Enh. There's a difference between "knowing" a language, and being proficient enough to discover and fix a minor bug in one. A good programmer can do the latter pretty quickly, even in archaic languages. – Telastyn Oct 10 '13 at 17:57
  • @CharlesE.Grant thanks for the edit, I didn't look and was recalling it from memory *oops*... Petzold wrote that famous "does visual studio make us dumber?" article and for some reason I got him mixed up with Norvig (probably the plain white page with a wall of text website is why I associated them together..) – Jimmy Hoffa Oct 10 '13 at 17:57
  • 1
    @RobertHarvey I'm just disavowing the general concept that languages are just a tool and all alike, I never said you couldn't get up and moving quickly in general programming in many cases, but the idea that "Oh I can just go learn any programming language because they're all exactly like the one(s) I know" is just an inaccurate perception. – Jimmy Hoffa Oct 10 '13 at 17:59
  • I'd have to semi-disagree with you. Of course a beginner is not going to be able to pick up a 2nd language in 10% of the time they learned their first language. But somebody coming out of a decent CS program or equivalent education should have been introduced to multiple language paradigms. Can they write a greenfield program from scratch in a new language after a week's study? No, but they should be able to start fixing bugs in an existing code base. – Charles E. Grant Oct 10 '13 at 17:59
  • 1
    @Telastyn True, but usually when people make the statements like the ones I mentioned at the intro they don't mean "I could fiddle with it" they tend to mean "It's just like using a different philips head screw driver, if you can use one you are equally proficient with all of them". Even still, the gap between 10 years of industry OO experience and being able to fix your first bug in a Haskell program is still significantly more than a week I'd say. – Jimmy Hoffa Oct 10 '13 at 18:00
  • 5
    @CharlesE.Grant I think you overestimate what the majority learn in college, also how long it takes to become even semi-proficient in a language like Haskell or Prolog. I would argue a skilled industry experienced engineer with no functional programming experience would take significantly more than a week to be able to fix his first bug in a Haskell program. – Jimmy Hoffa Oct 10 '13 at 18:03
  • 11
    I'd still argue that the very fundamental set of concepts is pretty compact. Once you understand *term rewriting*, you have a tool for defining lambda calculus, SK calculus, Turing machine, Markov algorithm, etc. A small number of truly fundamental ideas can cover most of the computer science. But, of course, experience is required to be able to see the simple patterns in seemingly complex things. – SK-logic Oct 10 '13 at 18:32
  • 1
    @SK-logic I +1 your comment because frankly **I'm not sure**. The fact that there are things as disparate as Blub and fractran, or even in similar languages: Going from SML to fixing a bug caused by improper universal quantification in Haskell betrays many languages are more fundamentally divergent than they appear. At the same time I don't disagree with you that there are some fundamental concepts like term rewriting that marry many many things which would otherwise appear divergent. I'm not convinced it's a compact list, but +1 because people should read your comment and think about it. – Jimmy Hoffa Oct 10 '13 at 19:44
  • 5
    I'd say it's not so much the Dunning–Kruger effect, as simply assuming that "programming language" = "c-style programming language". After knowing a decent amount of c++, a decent amount of C#, and some smatterings of perl and python, I expect I could become reasonably fluent in Java, PHP, etc. in a week. Not necessarily expert, but at least fairly fluent. I *did* pick up javascript in a few days. At that point, it's mainly about learning the differences between them. Note: most popular real-world language *are* c-like. The same would not necessarily be true of, say, Prolog. – neminem Oct 10 '13 at 23:00
  • 1
    I'd argue that the reason it's more difficult to pick up Prolog or Haskell has more to do with syntax than with programming in a different style (e.g. functional vs. OOP, or whatever) – Wayne Werner Oct 10 '13 at 23:00
  • 2
    @WayneWerner Seriously, the difference between Haskell or Prolog and algol languages is *so much more* than syntax, you are just perpetuating the myth. Take my test above: Spend a week trying to learn Haskell and see how that works for you. Honestly, it'll be good for you, lots to learn from doing so. – Jimmy Hoffa Oct 10 '13 at 23:09
  • @neminem I agree though I bring it up because it's *related*: They're both based on making spurious assumptions due to lack of knowledge coupled with lack of knowing you're lacking the knowledge. It's not that people are stupid, far from it, they just don't know that they don't know about all those non-algol languages. – Jimmy Hoffa Oct 10 '13 at 23:15
  • 1
    +10 (well, if I could) As to C-style languages, last week I fixed a bug in a C# program, but I've never studied C# at all. I also listened to the Catalog Aria from Don Giovanni a few times and now I know some Italian. But to say I "picked up" either language is equally absurd; sure, there's an extent that skill in English can be applied to Italian, and C-style languages superficially are a lot alike. A good debugger makes fixing bugs in a language you don't know, at times, trivial. Regardless, counting to 10 in Chinese doesn't make me fluent, and fixing a bug in C# doesn't mean I know C# – BrianH Oct 11 '13 at 01:08
  • 1
    I partially agree with you. Yes, if you only had experience on imperative/OO languages, you will not be able to pick up Haskell or Prolog in one week. But this isn't a fair comparison. A new paradigm is a new set of thoughts, it's not simply learning a new language. However if you're very familiar with Haskell, you will be able to program in Erlang or Scala in one week. Of course you will not be proficient, but you're certainly getting your way around. – Bernardo Pires Oct 15 '13 at 19:46
  • Excellent answer. The first time I had a "these things are not like the others" moment was learning HLSL for the first time. Talk about reaching outside my programmer comfort zone! That being said, OP, programming is as much a task as it is an educational marathon. You won't ever truly know everything...but the pursuit of and proficiency with finding that knowledge is what separates the true programmers from the Dunning-Kruger-ers. Just know that you have asked an excellent question and one worth revisiting often, which tells me you are already farther along than you might think. – Mike H. Dec 09 '13 at 13:59
52

... how can I develop programming skills that can be applied towards all languages instead of just one?

The key to this question is to transcend the language and think not in the language you are coding in.

WAT?

Experienced polyglot programmers think in the abstract syntax tree (AST) of their own mental model of the language. One doesn't think "I need a for loop here", but rather "I need to loop over something", and translates that into the appropriate for, while, iterator, or recursion for that language.
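To sketch the idea (my own Python illustration, not from the original answer): the abstract intent "combine everything in this collection" is one node in the mental AST, but it can surface as an explicit loop, a built-in, or recursion, depending on the target language.

```python
# One abstract idea -- "combine everything in this collection" --
# rendered three ways. The mental model is identical; only the
# surface syntax differs.

def total_loop(xs):
    # Explicit for-loop rendering, typical of Algol-family languages.
    acc = 0
    for x in xs:
        acc += x
    return acc

def total_builtin(xs):
    # Idiomatic Python: the iteration is hidden inside sum().
    return sum(xs)

def total_recursive(xs):
    # Recursive rendering, as a functional language might express it.
    return 0 if not xs else xs[0] + total_recursive(xs[1:])
```

All three are translations of the same AST node; a polyglot picks whichever one the language at hand favors.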

This is similar to what one sees in learning a spoken language. People who speak many languages fluently think the meaning, and it comes out in a given language.

One can see some clue of this AST in the pair of eye-tracking videos Code Comprehension with Eye Tracking and Eye-Tracking Code Experiment (Novice), in which the eye movements of a beginner and an experienced programmer are watched. One can see the experienced programmer 'compile' the code into their mental model and 'run' it in their head, while the beginner has to iterate over the code keyword by keyword.

Thus, the key to developing programming skills that apply to all languages is to learn multiple languages, so that one can distance oneself from the mental model of any one language and develop the ability to generate the AST for a problem in one's own head, which is then translated into a given language.

Once one has this ability to use the AST in the head, learning another language within a similar school of thought (going to Befunge is a bit of a jump from Java, but not as much from Forth) becomes much easier - it's 'just' translating the AST to a new language which is much easier the 3rd, 4th and 5th (etc...) time it's done.


There is a classic article, Real Programmers Don't Use Pascal. Part of this reads:

... the determined Real Programmer can write Fortran programs in any language

There are also bits for which you can't just use the mental AST - you need to think in the language too. This takes a bit of time to accomplish (I'm still accused of writing Perl code in Python and my first Lisp code was reviewed saying "This is a very good C program.").

To this, I must point out an article published by the ACM, How Not to Write Fortran in Any Language. The third paragraph of the article (excluding the leading quotes) directly addresses the question at hand:

There are characteristics of good coding that transcend all general-purpose programming languages. You can implement good design and transparent style in almost any code, if you apply yourself to it. Just because a programming language allows you to write bad code doesn’t mean that you have to do it. And a programming language that has been engineered to promote good style and design can still be used to write terrible code if the coder is sufficiently creative. You can drown in a bathtub with an inch of water in it, and you can easily write a completely unreadable and unmaintainable program in a language with no gotos or line numbers, with exception handling and generic types and garbage collection. Whether you're writing Fortran or Java, C++ or Smalltalk, you can (and should) choose to write good code instead of bad code.

It isn't just enough to have the AST - it's necessary to have the AST that one can translate into other languages. Having a Fortran AST in your head and writing Fortran code in Java isn't a good thing. One must also be familiar enough with the language and its idioms to be able to think in the language (despite what I said at the very top).

I've seen Java code written by someone who hadn't stopped writing C code. There was one object with a main method. In this object were a bunch of static methods called by main, and private inner classes that had public fields (and thus looked a lot like structs). It was C code written in Java. All that was done was translating the syntax of one language to another.
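The same failure mode shows up in any language pair. As a sketch in Python (my own illustration, not the Java code being described), compare C-style index bookkeeping with the idiomatic version:

```python
def longest_word_c_style(words):
    # "C written in Python": manual index arithmetic and bookkeeping.
    longest = ""
    i = 0
    while i < len(words):
        if len(words[i]) > len(longest):
            longest = words[i]
        i = i + 1
    return longest

def longest_word(words):
    # Idiomatic Python: let the language's own constructs carry the design.
    return max(words, key=len, default="")
```

Both return the same result; only the second reads like the language it is actually written in.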

To get past this point, one needs to continue writing code in multiple languages: not thinking in those languages when designing the code, but thinking in them when translating the design into code, so as to work with the language's idioms correctly.

The only way to get there - being able to develop programming skills that can be applied to all languages - is to continue learning languages and to keep that mental programming language flexible rather than tied to one language.

(My apologies to ChaosPandion for borrowing heavily from the idea he presented.)

  • 3
    No need to apologize. I think you've written an impressive answer. – ChaosPandion Oct 10 '13 at 21:21
  • I wanted to credit the person that got me thinking of it in that direction to write the answer. –  Oct 10 '13 at 21:28
  • 3
    This is a *very* good answer. Wish I could upvote twice. – Wayne Werner Oct 10 '13 at 23:04
  • 2
    Actually this is exactly why you shouldn't learn OO first, as this formats your brain with one of the worst AST imaginable. – Morg. Oct 11 '13 at 06:29
  • @Morg. this is why people have said in the past that Basic / Fortran / Assembly teaches bad habits. This is also why (one theory goes) that MIT used to start out the CS program with LISP - to completely invalidate any AST that a student already had when entering the program (and make them more receptive to learning the basics again?). –  Oct 11 '13 at 14:06
  • I may be teaching someone to program real soon and I'm thinking of providing examples in various languages using various techniques. It will be interesting to see how that affects their development. – ChaosPandion Oct 11 '13 at 14:13
  • @ChaosPandion it will probably confuse them because they won't be able to get a full picture of any one thing with that approach. Pack that idea in and go with SML; It's the simplest syntax language of any so you won't get bogged down in syntax (basically looks like math which they'll be familiar with and has *very* few language-level keywords/constructs etc) as well as allowing you to teach a clean abstract approach to problem decomposition. – Jimmy Hoffa Oct 11 '13 at 14:53
  • 1
    @JimmyHoffa - You might be right. I've always initially taught using one language and slowly introducing more later. Still I do think it is worth exploring as I can always hit the brakes and have them focus on one language. (SML seems like a pretty good choice actually.) – ChaosPandion Oct 11 '13 at 15:03
12

Pick a language, and start coding. Python is a good choice for a beginner, and there are tutorials available online, so that you can learn how to do it properly.

Everything follows from that. Your interests will lead you to frameworks and design concepts that will add sophistication to your programs. You will discover that there are online courses you can take that will ground you in the fundamentals and the theory, and that there are different programming paradigms you can explore, and so on.

And yes, you will discover languages like Haskell that will teach you something new, once you have a firm grounding in the fundamentals.

Some programmers probably think all languages are the same because they haven't been exposed to any that make them think differently. All of the most commonly used languages are derived from Algol (they are essentially procedural languages), and of those, most are curly-brace languages similar to C. All of them do essentially the same things, albeit some with more sophistication than others.

Robert Harvey
5

Programming is about solving problems in such a way that the solution can be expressed in a grammar restricted enough to be implemented in a programming language. The art of programming is therefore the art of solving problems.

Certain languages invite particular programming paradigms, such as object orientation, event-driven programming, multi-threading, and MVC-framework-based design. These are all just models and patterns, and have nothing really to do with implementation.

If you can sit and solve a problem on paper in such a way that it could easily be translated into code and associated with an appropriate model for your platform, then you are a programmer. If all you can do is take those solutions and implement them in your chosen language, then that's another matter.

I have been programming for 30 years (OMFG!) and still use php.net to look up commands in PHP because it's not my first language.

I would say that expertise in languages is inversely proportional to how often you look at the manual or stackoverflow. Expertise in programming is how readily you solve problems in a way which is compatible with computer programming languages.

In related news, I learnt Ruby last week. Though I'm no "expert", I can solve a problem for you, write it in Perl, say, and then spend an age translating it into Ruby whilst I learn it some more.

stevemarvell
  • Your comment is the first one I've read about models and patterns! I'm 100% with your comment: one thing is to grab a language and start making a program; another is to think through the problem and find the appropriate tools to solve it, and only then start looking for a language and start programming. –  Oct 11 '13 at 13:35
3

I think, as with anything, practice makes perfect. Just don't pigeonhole yourself into always doing the same thing or always using the same language, and keep learning new things on every project.

I think you can easily draw a parallel to something like learning to play a guitar. Any good musician can learn to play a new song in a very short period of time, because they already know all the chords and all the theory behind why the chords are played the way they are. How do they get that good? They have simply played so many songs that all the patterns have blended together, while at the same time supplementing their knowledge with the actual documented theory that those patterns subscribe to.

So maybe you can play a few songs very well, but you can't deviate or pick up new songs quickly. This is probably the equivalent of a .NET programmer who keeps making the same CRUD application over and over. At some point, try something new: add in some web service calls or an advanced UI, or write it in a whole new language. When you hit a snag, look into why things happen the way they do, ask questions on Stack Exchange, etc. Eventually, you will see all the patterns that continually come up, know some of the underlying theory, and learning a new language won't seem nearly as daunting.

Peter Mortensen
KDiTraglia
1

I'm not going to address how long it takes to learn a language, or what it means to learn a language; instead, I'm going to address your actual problem: how to determine whether you have learned to program or have merely learned a programming language.

You've learned to program if you have learned to break a problem down into discrete processes and then use those processes to solve your problem. You've learned a programming language if you've learned the syntax of a language and know how to adjust how a process works when implemented in that language.

This is not to say you should program in Fortran when using Lisp, or add up the values of a column in a database table using a cursor. Just that the language is an implementation detail - one that can change what processes are needed, but not the need for identifying and creating processes. In the end there is a real-world implementation, with input/output and desired results.

jmoreno
1

My strategy has always been to focus on pure skills rather than specific skills.

Instead of learning Python's (or any language's) special syntax for whatever it is you want to do, spend your brain cycles solving abstract problems, like how best to solve every problem in that category.

That way, you will know what to do no matter the language, and will mostly possess timeless skills that can be used for programming in any language.

Specifically avoid tools that are full of gotchas, like MySQL, or opinionated languages, like Java, as whatever you learn by using these tools will have a big proportion of tool-specific knowledge which is bound to become useless pretty fast.

Contrary to what has been said in many answers, do NOT listen to other programmers. You are a noob, and there is no way you can tell the fake from the real deal, so you're better off taking everything with a spoonful of salt.

You want to be questioning all the time, accepting a solution only when it is fast, elegant, and reliable.

Morg.
  • 1
    _"do NOT listen to other programmers"_ -- yeah sure. "- How would you know if you've written readable and easily maintainable code? - Your peer tells you after reviewing the code. Rationale: You cannot determine this yourself because you know more as the author than the code says by itself. A computer cannot tell you, for the same reasons that it cannot tell if a painting is art or not. Hence, you need another human - capable of maintaining the software - to look at what you have written and give his or her opinion..." ([quote source](http://programmers.stackexchange.com/a/141010/31260)) – gnat Oct 11 '13 at 07:11
  • @gnat do whatever you like. I'm just telling you that since most programmers can't code for shit, their feedback is potentially harmful, and you should be bringing bags and bags of salt to deal with that. Also I believe "editable and readable by morons" is not a sign of quality at all. Believe what you want but don't go around -1 just because people don't agree with your vision. – Morg. Oct 11 '13 at 08:11
  • my vote indicates evaluation of post _quality_, not whether I agree or disagree (wrt agreement, I rather think you have a point here). I quoted another opinion not because it's opposite, but because it has a solid explanation (see "RATIONALE"). If you can think of similarly solid explanation to back up your opinion, consider [edit]ing the post to add it – gnat Oct 11 '13 at 08:40
  • whatever. content > form . keep your form, I'll keep my content. – Morg. Oct 11 '13 at 08:55
0

I think, if you can think analytically, you have a good start.

Learn any language you want and work yourself through a series of examples, e.g. as presented in nearly every book that teaches programming.

Next, try to solve your own problems. Try to find different solutions and compare them. Speed and memory usage are commonly used factors that matter. Discuss your solutions with other programmers.

Read code of other programmers and try to understand why they solved the problem this way.

You should also read some books about algorithms to get an overview over standard approaches. New problems are often modifications of old problems.

A lot of practice and working with code also in teams will help you to increase your skills step by step.

I hope my opinion answers your question at least partially.

MrSmith42
0

There's the theoretical approach: learning about how computers actually work under the covers, and how the basic processor instructions are strung together to make the more complex operations and structures that we take for granted in high-level programming land.

Then there's the more practical programming approach. The main sticking point that plagues people usually labeled as "not good programmers" is that they only really know one language. And even if they know others, they program in them the same way they do in their native language. That's a cycle one must break if one really wants to learn how to program. The default answer is to learn at least one language from each programming paradigm. So learn an OOP language, a functional language, a scripting language, etc. And by learning I don't mean learning the syntax. You learn a language by actually using it to create something.

Personally, when I want to learn a new language I use Project Euler puzzlers. I go to a puzzle that I have already solved in an OOP language (as an example) and try to solve it using a functional one, while trying to follow the best practices of the new language. When you solve the same problem using two fundamentally different approaches, you not only see what the real differences are, but also where the common areas lie. These common areas shared by all languages are the real programming; the differences are just different ways to achieve it.
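As a minimal sketch of the exercise (my own choice of puzzle, Project Euler problem 1, not one the answerer names): the same problem solved imperatively and in a functional style, in Python.

```python
def euler1_imperative(limit):
    # Imperative style: mutate an accumulator inside an explicit loop.
    total = 0
    for n in range(limit):
        if n % 3 == 0 or n % 5 == 0:
            total += n
    return total

def euler1_functional(limit):
    # Functional style: describe the set of values, then fold it with sum().
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)
```

The common area - "filter the range, then accumulate" - is the programming; whether that shows up as a mutating loop or a generator expression is the language.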

System Down
  • 4,743
  • 3
  • 24
  • 35
  • 4
    I wouldn't call learning about the physical behaviour of a computer a "theoretical approach". A "theoretical approach" would be learning the theory: reading the Church-Turing thesis, learning about the Curry-Howard isomorphism, the lambda calculus, and the basics of number theory — these are theoretical underpinnings. Not saying your answer is right or wrong, just saying I would call that the concrete approach, not the theoretical one, because it lacks theory. – Jimmy Hoffa Oct 10 '13 at 18:39
  • @JimmyHoffa - Good points! – System Down Oct 10 '13 at 18:42
  • 1
    "How the basic processor instructions are stringed together (...)" seems like a terrible idea for beginners (the OP did not state he is one, but let's assume so for the sake of argument). It'd teach 'micro-optimizations' without really teaching how to optimize (a 3-5 stage architecture can be considered a bit outdated...). Don't get me wrong - CA's fascinating - but 'proper' appreciation would require words like 'out-of-order' and 'multiscalar', and probably come after some basic programming experience. – Maciej Piechotka Oct 10 '13 at 23:55
0

In my case, I learn how to actually program through the following:

  1. Learn from the masters. Listen to programming podcasts, read professional blogs on your programming topic of choice, read/watch the excellent tutorials by gurus that are scattered all over the web, and lastly, read epic books like The Pragmatic Programmer. That book is full of programming gems accumulated throughout the authors' careers. One surefire way to learn how to actually code is to see how other successful programmers do it.
  2. Experience by doing. Reading about it and knowing is one thing, actually putting it into practice and getting it to work is another. There is no better teacher than experience, so put your coding cap on and get started.
  3. Ask someone who knows. Just like you're doing now, don't be afraid to ask the seniors on your team about best practices or better ways to do things; and if you're unfortunate enough not to have access to said seniors, mentors, or gurus, there's still the rest of Stack Exchange and the internet to ask.

Also, as your commenters have mentioned, don't forget to master your tools as well. All the best practices and greatest theories are for naught, or will be poorly implemented, if you don't know enough about your tool: in this case, a programming language.

Andres F.
  • 5,119
  • 2
  • 29
  • 41
Maru
  • 1,402
  • 10
  • 19
0

Well, most of what I wanted to say has already been said. What I would like to add is a very simple analogy.

Even if programming languages are considered mere tools, there is no reason to believe that being good at one makes being good at another a cakewalk.

Imagine a group of renowned master swordsmen who suddenly put down their swords and went off to battle with spears after seven days of training. What would happen? They would be massacred.

Languages are often not difficult to learn, but it takes patience and practice to become good at them. Additionally, there is no single right way to learn programming.

Learning a programming language is like playing an RPG. Sometimes you use a sword, sometimes a spear, sometimes a shield. Each enemy you kill earns you experience points, and once you have enough, you level up. Mastering the sword will not make you excellent with a bow and arrow, but a portion of the experience you gained will still increase your stamina and speed.

Here are a couple of things you might want to do when learning a language:

  • Read about the language. If it sounds interesting, try out the hello-world app(s) yourself.
  • Read some tutorials, tricks, and blogs.
  • Make simple apps in it just for fun.
  • Test different features.
  • If you really like it, buy some books and/or video tutorials.
  • Search for good libraries.
  • Search for answers, ask only if you can't find the answers.
  • Help others who are asking for answers (where better than here?).
  • Make something useful. Writing a calculator app may be a good exercise, but if you make a to-do list app and actually use it on your PC/phone, the feeling is 100 times more satisfying.

Experiment with new languages, explore new libraries, and learn new tricks in your free time. Before you know it, you'll surprise yourself with your own skill.