In general, I've found that almost all success at anything worth doing comes from repeatedly failing until you don't. If you fail the first time and give up, you probably didn't care enough to do it anyway. If you succeed the first time, it probably wasn't that hard. Fail often and fail hard and be proud of it! It's a sign of progress, or at the very least attempted progress.
I subscribe to the "you must try, try and try, try and try" method of failure as described by Mr. Cliff. If it doesn't work out by the 5th time, just repeat the cycle as necessary.
Or a sign that you should try something else. Not everyone can succeed at everything. Obviously it is best to recognize that as early as possible in the endeavor.
The best stories always involve somebody who WANTS to do something so badly that they just keep at it at all costs and eventually make it. Great story, but the more common one is somewhere between giving up and realizing that something isn't as valuable to you (or as fun) as you thought it might be: you realize you don't, or probably won't, enjoy the process. The no-pain-no-gain model is true to an extent, but often once a person gets a closer look, they can probably deal with the pain; it's the gain that is no longer all that shiny and worth pursuing.
So you must look at the objective with short and long goggles to see if your understanding about it is right. As you do things well they might get more enjoyable with time. Finding a balance where challenge is still there but you're in the zone of craft. Where you know you can make it work and the challenge is something surmountable. The memory of how hard it was to get to that point isn't one of regret or pain. Just learning, and maybe relief.
Not sure the last example the author gives of chess-playing programs is a good one though; a task where it is literally impossible to succeed would just serve to discourage even the most motivated of newcomers, I'd think.
Was easy as a kid - effortless even. If your day job doesn't involve coding then it fades fast though.
The mindset / thinking style thankfully doesn't seem to fade though. That's why I'm all for making kids learn to code. Yes they'll suck & no they won't become real programmers, but it leaves a permanent imprint on the minds of those that take it seriously.
I loved it. I bought my first domain when I was 10, intending to create a game like Neopets. I obviously quickly learned I wasn't going to be able to, but I created a fan site, kept hacking at it, and learning. I learned security, etc. By the time I was out of high school, I was recruited by Barracuda Networks.
Some of those kids will go on to be decent programmers. And it definitely leaves you with a much more analytic way of thinking, for better or worse.
Not OP, but I'd like to chime in here. Getting good with computers is often best done by spending a large amount of time on computers. For many of "us" that involved teenage years playing MMORPGs, general programming, IRC, etc.
Despite a certain illusion of sociality involved in these activities, I am of the opinion that they are uniquely anti-social, and thus were in one way or another a deprivation of my childhood.
Sure I now have a much higher earnings potential. But was it worth it? I don't know. And thus I'm in no position to recommend it for anyone else.
There's probably a balance in it somewhere. And you can be damned sure if I'm ever a parent I'm going to tread lightly between "my child should learn to program, go to college, etc." and "my child needs to live a happy, normal life".
That's why it's a "for better or for worse" mentality in my opinion. We don't really know which is best, do we?
I think about the same thing quite a lot. I remember having anti-social tendencies from a really young age. No idea why, I was just rarely interested in other kids. So I spent a lot of time dicking around with my computer. That paid off, but I got a little better with social skills later on after working at it. I always wonder how my life would be different if I didn't have all the computer escapism and worked on social skills earlier.
This is exactly my education path. The great thing was my MMO opened up both the client and server for development so I learned to program using GScript. God awful language in hindsight but addictive. I eventually spent more time developing for the game than I did playing it. That was god awful too since you had to pay to develop content for their game!
This is very true. I was raised in Redmond, Washington (home of Microsoft and others), and the school curriculum was influenced to some degree by proximity to software companies. There were many lessons that were in some way relatable to programming, and those made learning programming later on much easier.
I also grew up in Redmond, but probably a little earlier than you. Your comment made me wonder why it took Microsoft so long to get into the schools -- from 1982 to 1990, it was all Commodore PETs and Apple IIs. In Microsoft's own backyard!
You're on exactly the same wavelength as me on this...but that phrase made me pause. Remember your kids will judge whatever you give them against the latest ipad3000. If that gap is too big they'll lose interest forever & you risk losing the war because you weren't willing to concede a battle.
I grew up exactly as you described - built a dos box with my dad & it was all mine. The internet world is too complicated though - you need an adult to hold your hand.
The above is of course opinion - not fact. To each his own...
My interest in computers as a kid (4 or 5) came from watching my dad working/playing on the computer; he would show me simple commands in DOS and little things (the paint bucket stood out for some reason) he could do with it. It really left an impression and told me that the computer was a useful/fun tool worth exploring, which I believe eventually led to me trying to figure things out on the computer and making websites/interfaces.
I'm waiting for an article about "How I Failed, Failed, and Am Still Failing" to provide the flip side in contrast to these success stories we are so biased towards.
here you go - "So I fired up my IDE and spent an hour, finally getting my C# program to compile. It put out bullshit, I couldn't figure out why. I gave up. Then I fired up my ole' Putty and tried to get into my server to figure out why my database kept giving me an error. I thought about rolling back but couldn't figure out how Chef actually works, even though I set it up. Finally, I had this really cool idea for a javascript game I could code up real quick, and spent an hour in the javascript console trying to figure out why it wouldn't execute. Gave up on that too. I read some forums, there was a cool app to scrape my facebook, I ran it, it just requires Python, but it wouldn't run. Overall I accomplished nothing. Oh, and Chrome is still crashing at the most random times, no idea why."
Holy shit you just articulated all of the reasons that hackers value their knowledge - and the very important quality of never giving up. You search. You go on IRC. You get the source and compile it. You step aside and learn a tool better in isolation - then use it on your problem. You poke, prod, test. And your understanding begins to form, to conform to the shape of the problem, the map of what you know, what you've tried, what you have yet to try. You talk to others to fill in the important blank of what you don't know yet to try.
Because that story you told is abhorrent, even profane, but all too common.
These are the stories we like to read! A lot of us are procrastinators. I will remember your reaction to a comment.
I will now get off HN, and open up my more dusty Folders!
I think there are a lot of those stories. I know I can't call myself a Programmer yet. I can call myself a website builder though. I can spot errors in other people's Javascript, but I still would never call myself a Programmer.
And even though I can put up websites, I'm not that proud of it; 95% of the learning process was just following directions.
I have noticed there are a lot of successful people out there who hang titles behind their names and who aren't very good at what they do, whether it's Programming, Business, or Whatever Expertise they claim.
I sometimes think the key to success is over-inflating one's ability and faking it. I personally know so many extremely wealthy Fakers. Programming is one field I couldn't imagine faking it in. I guess my point is I know too many people who literally lied about their abilities, and in the end--Won big time!
I sometimes feel the way I was brought up (going to church every Sunday and feeling very confused) was detrimental to my wallet. Plus, as a kid, unless we really, really were great at something, we kept our accomplishments/skills to ourselves. We never got the chance to brag. If we did (brag), someone would literally, usually verbally, smack us, and we never acted overconfident again. I honestly feel being modest and not chasing the Dollar held my generation (born 60's-70's) back. (What am I saying, I still can't stomach people who over-inflate their abilities, nor go after money too aggressively--especially money from the poor/middle class! The thought of my filthy Looker sister is enough to get me to stop this post. She did some damage to her family over the almighty Dollar--really weaselly stuff. She's rich though.)
I keep seeing all these programming articles and schools that'll teach you in 3 months and you'll make 70k+. Someone is trying to flood the market to drive wages down.
Notice also that now it's all about "coding" and never programming or software development/engineering. This is deliberate.
And good luck getting respect from non-technical managers, policy makers, or the public at large when your actual job title is "coder" and all of these articles, bootcamps, and CodeCademy type websites have convinced everyone that "coding" (unlike law or medicine) is super easy and anyone can do it!
"Why isn't the ERP system ready yet? It's just coding, after all... I did that on CodeCademy when I was in high school as part of the National Teach Everyone To Code Initiative and it was easy. I already did the hard part by telling you exactly what we need (and telling you again each of the fifty or so times our requirements changed); all you have to do is code out the details. What's the hold up?"
>>Someone is trying to flood the market to drive wages down.
Why resort to such a conspiracy theory? Do you have any evidence?
I subscribe to a much simpler explanation: the market is simply responding to demand for higher-paying jobs. Colleges are obviously failing to train people for such jobs, so vocational schools (i.e. for coding) are picking up the slack.
Anybody entering the market that has even a grain of intelligence would be spending their time both before and after being hired learning how to engineer software (language/framework agnostic) instead of forever monkeying around with a scripting language in a single environment. Doing anything less is eventual career and paycheck suicide. Python or Ruby are fine as starting points, but if you stop there you're fucking yourself over.
I have to wonder what sort of code your average bootcamp graduate is really capable of producing. I'd say that you could get really, really good at a single concrete stack and toolset in that time (e.g. osx, bash, Linode, postgres, rails, chrome, sublime, chrome dev tools, maybe nginx, wireshark, redis).
But as soon as you swap out even one item, the pain will start.
Now do it with Windows, powershell, Azure, mysql, php, firefox, notepad++, firebug, IIS, mongo. Now do it with Linux, zsh, Digital Ocean, Meteor, Safari, vi, django, apache. Now do it with Linux, bash, AWS, Java, Spring, Oracle, Eclipse, Glassfish. Now do it with RedHat in Docker on AWS running Postgres, Redis, Nginx, Java and Node, over WebSockets, on any browser.
Now you will know what you are doing. It will take about 10 years.
Of course, within ten years half of the listed products will be gone, and replaced with the new hotness. But there are definitely a lot of low-level and high-level concepts you pick up along the way, and what I tell everyone who is learning is that the MORE stuff you try, the better you will be at learning MORE stuff. Every piece you learn about, you'll find another piece later that mirrors, borrows concepts, or shares ancestry, and you'll pick it up twice as fast as the last time.
I'd go even further: if you learn how to do it with lots of different tools in different combinations, then you get a feel for what is important and what is not, what is fashion and what is fundamental, and what software applications are, in general. Most of all, you can begin to make the critical distinction between "application" and "infrastructure".
But this is a very difficult path, one made slippery with tears.
You think people with CS degrees deal with stack swaps better than boot campers? I went to a top 10 CS program and I don't think so. Both populations push out equally ineffectual grads; only passionate programmers can really be fluid, something that no program can instill.
Partly quoting myself from another wall-of-text discussion forum where code academies were discussed.[1]
Some developers need an introduction and/or mentoring. However, once past the absolute fundamentals (e.g. basic control structures and logic), mentoring becomes more strategic (less about technical details), and improving as a coder is a matter of how adroitly one can understand more complex patterns and absorb novel formalisms (frameworks, languages, hardware, integrations, etc.)
Growing beyond the level of individual contributor is working with individuals and teams to coordinate the development of more complex software systems. These human/machine systems are very complex and highly dynamic, to the point that there are no accurate models to describe how they work.
EVERYTHING is custom and cutting edge. (I'm thinking here of some of what Paul Ford said in his "What is Code?"[2], specifically section 5.6 "Off the Shelf")
This means that the career paths of developers cannot be predicted by simply taking into account educational backgrounds. Models that try to predict developer career paths from educational backgrounds are not even wrong, because they model the reality of software development so poorly.
There will be lots of room for grunt coders who start off in code academies, sure. But some of those coders will start entry level and, after much more learning, go on to help develop world-altering highly-regarded software. Some coders trained at the most elite institutions will never contribute even one code block to a useful project of any kind. (I'm exaggerating, but you get my point.)
TL;DR: Code academies are simply another means by which humans will become part of the software industry and the status and stature of developers so initiated will be manifold.
> Code academies are simply another means by which humans will become part of the software industry and the status and stature of developers so initiated will be manifold.
Or to put it another way: dev quality is independent of education. I agree with that.
The point I was making was more about what it actually takes to become a really good programmer. Against the 10 or 20 years of experience it takes, both college and a boot camp look roughly the same: approximately irrelevant (although studying classic computer science, operating systems, algorithms, and data structures is all quite relevant).
It's the same as the "scientist shortage" except instead of taking 5 years to churn out PhDs who can't get research jobs, you can churn out "coders" in only 12 weeks!
I am doubtful that one of these boot-campers will be as good as someone with 10+ years of coding experience, so there will still be a wage premium for the most competent of coders.
I am curious whether Project Euler teaches you programming. It does teach you mathematics and thinking logically, but software engineering is different and is really learnt on the job. I am always confused when someone talks about learning or teaching coding. What is meant by coding here? Writing fizzbuzz? Or writing enterprise software? There are so many levels in between and around as well.
> I am always confused when someone talks about learning or teaching coding. What is meant by coding here? Writing fizzbuzz? Or writing enterprise software?
Isn't that like asking, "What is meant by 'learning to write'? Subject-verb sentences? Or the next Great American Novel?" It's possible to learn programming without getting into software engineering.
Many aspects of programming like code readability, patterns, and abstractions could all be taught in the same fun, interactive way these problems are. This site just focuses more on the math and logical-thinking parts.
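For anyone who hasn't tried it, Project Euler's first problem (sum all multiples of 3 or 5 below 1000) shows the flavor the commenters are debating: the code is trivial, and the interesting part is the mathematical shortcut. A minimal sketch (both the brute-force and the closed-form approach are mine, not official solutions):

```python
# Project Euler, Problem 1: sum of all natural numbers below 1000
# that are multiples of 3 or 5.

def euler1_brute(limit=1000):
    # Brute force: check every number below the limit.
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

def sum_multiples(k, limit):
    # Closed form: k + 2k + ... + mk = k * m * (m + 1) / 2,
    # where m is the count of multiples of k below limit.
    m = (limit - 1) // k
    return k * m * (m + 1) // 2

def euler1_math(limit=1000):
    # Inclusion-exclusion: multiples of 15 are counted twice above.
    return (sum_multiples(3, limit) + sum_multiples(5, limit)
            - sum_multiples(15, limit))

print(euler1_brute())  # 233168
print(euler1_math())   # 233168
```

The brute-force version is "coding"; noticing the inclusion-exclusion trick is the math-and-logic part the site is really exercising, which is the distinction the parent comment is drawing.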
It took a while before programming really "clicked" with me too, and I started with BASIC on an apple ][. It probably wasn't until a year or two into CS that I could approach a problem in an organized manner.
Now after decades of exposure, as my CS algos professor prophesied, it takes me a week or two to pick up a new language.
Welcome to the New World. The education system in America is broken, especially when it comes to teaching evolving topics like programming. Imagine what the world would be like if, instead of the 30 hours a week Americans spend on average watching TV, they took that time to learn and create.
I imagine it being much more competitive for us if it was the case. I'm actually very happy that not many people know how to code or have the interest in it. It almost feels like we have unfair superpowers.
"In a way, the ORIC-1 was so mesmerizing because it stripped computing down to its most basic form: you typed some instructions; it did something cool. This was the computer's essential magic laid bare. Somehow ten or twenty lines of code became shapes and sounds; somehow the machine breathed life into a block of text."
Oh, yeah. I still remember that feeling from Atari 65xe and then C-64. Now it's much harder to get such instant gratification. While trying to revive my passion and learn new stuff as an adult I also failed several times, because I set my aspirations too high and wasn't able to quickly learn all the necessary stuff at once :) Nowadays the path to "cool things" seems to be much harder, until you adjust your personal definition of cool.
I got that feeling recently learning iOS. I had previously taken many programming courses where everything happened in the console, and to go from that to storyboards and the robust UIKit frameworks got me hooked. I've been teaching myself and have slowly progressed from newb to relatively competent in the last six months. I still have a lot to learn, but I'm enjoying the learning process in a way I never thought I would.
I think a lot of people have a preferred "style" of development, and some are unfortunately deterred when the first they're introduced to doesn't quite suit them.
Take for example MATLAB vs. Scheme vs. C vs. Java vs Python. All very different ways of programming, all very valid, but everyone has their own style they prefer.
I think today you get that kind of interactive computing with something like the IPython notebook. You can load an image, apply an effect on it and immediately see the result. You can lookup documentation interactively too.
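To make that concrete: the loop is "tweak a value, re-run the cell, look at the result." A rough sketch of the idea, using plain Python lists as a stand-in for pixel data (in a real IPython/Jupyter session you'd load an actual image and render it inline rather than printing arrays):

```python
# A tiny grayscale "image" as rows of 0-255 pixel values.
img = [[0, 64, 128],
       [64, 128, 192],
       [128, 192, 255]]

def invert(im):
    # Photographic negative: flip each pixel around 255.
    return [[255 - p for p in row] for row in im]

def brighten(im, amount):
    # Add a constant, clamping at the 255 ceiling.
    return [[min(255, p + amount) for p in row] for row in im]

# Notebook-style: apply an effect, look at the result immediately.
print(invert(img))
print(brighten(img, 50))
```

That tight apply-and-look cycle is the closest modern analogue to the "type some instructions; it does something cool" magic described upthread.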
If anything, I think nowadays there are too many cool things, and the challenge is figuring out what can be achieved with them in a small number of steps, instead of getting lost.
Yes, Python seems to be the closest thing to BASIC, and the abundance of cool things is really distracting indeed. Thanks for pointing me to IPython, I hadn't heard about it before.
You should check out a modern Smalltalk environment like Pharo so as to find this interactivity taken to its ultimate conclusion (relative to what we have).
> If anything, I think nowadays there are too many cool things, and the challenge is figuring out what can be achieved with them in a small number of steps, instead of getting lost.
And which are actually important to know vs. which will fade in less than a year.
Cool things are rarely important to know, even if they stay cool for a long time. The important thing when starting programming is to have fun. You don't need to keep the knowledge. When I was learning programming in BASIC on the C64 I knew that's not the language people write real programs in, but it was still fun, and it taught me to think analytically and gave me some technical basics.
I always felt weird about this kind of math, because the problems don't have an "API", or an algebra; it's too open. But if you're into creativity and search then it's even more fun. Maybe a psychology thing in the end.