Somewhat related: Chris Smith is teaching kids programming using a tiny subset of Haskell, the Gloss graphics library, and a web framework that he built around Gloss, using the SafeHaskell extension.
His experiences can be followed through his blog.
You could also mention the Ruby-based Hackety Hack, a tutorial environment built on top of the Shoes GUI toolkit, targeted at tweens. You don't have to use toy languages.
Once they are proficient with the tutorials, students should be able to develop full-fledged Shoes apps which can be packaged as stand-alone executables. Shoes makes it extremely easy (and fun) to create and share games and other graphical programs.
Yeah, or they could not. HH is a great idea, but it comes with a tutorial that most of the people it's targeted at can't follow.
It's the kind of thing programmers love, "look, it teaches people to code using my favorite language!", but doesn't actually do anything to teach kids to program.
If you want to teach someone to code, start them off with Scratch (or something similar), then pick a language to start with based on their intentions. If they want to learn gamedev, for example, give them C++.
Yes, I fully expect this to be downvoted to hell and back because of all the _why worshipping on HN. I'm fine with that.
Yes. I recommended it to a few people squarely in its target market and not a single one could understand HH; not one even wanted to use it. The same people then learned various languages through different means and had no problems. To me, that is pretty damning.
> What we don't need is to teach a specific language. Any language will do.
Except BASIC. Anything but BASIC.
If I had to pick a language to teach 9-10 year olds, I'd probably go with Python. Of the 'real languages' (by which I mean languages substantially used in industry/academia for real applications), I think Python has the least overhead to get your head around before you can write functioning code.
Python sure. Moving into... Ruby possibly, or potentially PHP (if you combine it with one of the better/easier frameworks) and then HTML.
The latter ones because you can start to get them to deploy/work on websites. Being able to write websites is a generally useful skill nowadays (even if you don't go into programming.. all my family need/want websites, for example).
I teach an after school IT club and a project to maintain/develop a site on the school intranet has consistently been the favourite :) Because it's something functional and visible!
(and also a good opportunity to teach about security issues what with the trial-by-fire environment a school brings :D)
EDIT: The other good thing about web-dev is that it needs a few different skills - design, back-end coding, infrastructure/sysops and databases - which a) gives them a taster of all the career opportunities, and b) lets different people "shine" in different areas and contribute their strongest skill to the final project.
When you get to know Python, you don't need to "move into" Ruby or (shudder) PHP. It's much easier to learn a web programming framework in Python and how to deploy it, rather than to learn how your second programming language differs from Python.
The sad state of client-side programming is that you need to learn a lot about the messy world of HTML, CSS and JavaScript. Come to think of it, CoffeeScript might be a good language to start with today?
You know what's fun? Programming. You know what's not fun? General problem solving. You've got it all backwards. I learned to program when I was 8 years old, reading books and modifying games in BASIC. It was a great time. I never considered it work.
I was the same (well it wasn't BASIC, I was 8 in 1998). Programming is fun, and I never considered it work, but I don't think you and I were typical 8 year olds. Not sure if general problem solving is more fun, but you're implying 'programming is fun for 8 year olds', which may not hold true for all 8 year olds.
I was blessed to have a very good high school computing science teacher -- the class was a good mix of boys and girls, and of experienced and inexperienced, and in the end everyone produced some really neat games and had a great time.
I taught my daughter a little bit of programming when she was younger -- I tried her with Visual Basic and we built an app that did some computation -- and she really enjoyed it. It gave her an appreciation for what I do.
So I really think that programming is generally an enjoyable experience. That's not to say we aren't exceptional, but there are exceptional people in all sorts of pursuits.
Your 4yo knows Scratch? I'm trying to get my son interested in programming, but it's been frustrating for both of us. He is a lefty in a house of righties, so the mouse never works the way he wants it to.
If he doesn't make a good programmer he'll be a master mechanic. I've given him a precision screw driver set and he's gotten quite good at taking things apart.
The interface is really a bit too hard; generally she has to explain to me what she wants when she wants to do something more complicated, but she can drag and drop stuff from the sidebar to the active area and rearrange it there. Entering numbers is fun too; generally she wants all the numbers to be at 100.
I would really like to have a Scratch version for the iPad. The graphical visualization of programming is excellent, but it's really too complicated with a mouse. Also, painting with a mouse really sucks compared to painting with a touchscreen.
Looking at her with the iPad, the mouse really feels like a legacy device.
> He is a lefty in a house of righties so the mouse never works the way he wants it to.
You didn't state your son's age, but if he is around 3 or 4yo and you have a roller-ball mouse, it might help to teach him to put the mouse in his lap and manipulate the roller ball with his thumb. The scale of the mouse has a small child moving their entire arm instead of just hand/wrist motions. This was something my youngest son did when he was 2 or 3 and it made computers much more accessible for him and he became much less dependent on me doing stuff for him.
The petition itself is marred by a rather confused statement of its aims which seem to cover teaching children to code and doing something about the gender gap in IT.
This was my first thought too. A noble aim to be sure, but it would be nice to see it cast with broader aims!
I agree, although that didn't stop me from signing the petition. When I was at school I was told, "I.T. isn't a real subject", and was prevented from taking it. It was seen as a drop-out's subject.
That kind of attitude has to change, and although I think the wording of the petition could be better, I'll sign anything in the hope of getting this issue noticed.
But I.T. isn't a real subject, at least not how I was taught it. The mandatory course I took at school was essentially a certificate that says "This person can work the basics of the Microsoft Office software suite". No one in the class learned anything new, since we already used Word, PowerPoint and Excel for our other work. If they taught some actual Computer Science rather than just computer use, it would be a real subject in my opinion.
I think the former has value to people with no aspirations in CS (a lot of value, really, everyone needs basic IT skills in the modern world).
But you're right... they need to divide them - so teach "Basic IT Skills" (or w/e) and Computer Science as separate courses. A lot of the kids I teach are really put off by their IT classes because it is all about competency and not creativity.
> I think the former has value to people with no aspirations in CS (a lot of value, really, everyone needs basic IT skills in the modern world).
Well yes, but let me tell you a little story.
One of the things that I get to notice, as a woman, which my male colleagues don't, is just how repetitive women's work often is in the office. I have a lot of female friends who are secretaries, receptionists, assistants, etc., which I don't think is true of most men working in IT. And I see how much of the daily grind of what those women do could be automated, if only they knew how to put together macros, or even just how to make use of tools like Automator on the Mac. I have tried teaching macro programming to a couple of them, but they get stuck on the simplest of programming concepts. What's a variable? How can I express a complex algorithm given a small set of primitives? Flow control, what's that?!
I can't help thinking that if they had learned these very simple basic concepts back when they were learning how to set tabs in Word, they would be many times more effective in their office jobs today. So even if someone has no interest in CS as a career, the reality is that these days many, if not a majority, of jobs could be helped by it - running a small business, office work, etc. But if you leave it until the person is an adult out in the workforce, they often won't want to learn.
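As a hypothetical illustration of the kind of repetitive office task described above, a few lines of Python can stand in for an afternoon of copy-and-paste. The file names and columns here are invented purely for the sketch: merge a folder of monthly CSV reports into one file, keeping a single header row.

```python
import csv
import glob

# Fake sample data standing in for the monthly reports (invented for illustration).
for month, row in [("jan", ["alice", "12"]), ("feb", ["bob", "7"])]:
    with open(f"report-{month}.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["name", "hours"])  # each report repeats the header
        w.writerow(row)

# The actual "macro": merge every report-*.csv into combined.csv,
# writing the header row only once.
with open("combined.csv", "w", newline="") as out:
    writer = csv.writer(out)
    header_written = False
    for path in sorted(glob.glob("report-*.csv")):
        with open(path, newline="") as f:
            rows = csv.reader(f)
            header = next(rows)
            if not header_written:
                writer.writerow(header)
                header_written = True
            writer.writerows(rows)
```

Nothing here goes beyond variables, a loop, and a conditional -- exactly the concepts the comment says people get stuck on.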
They used to teach computing as an A level at my Sixth Form - however, it really was poorly taught.
There was no coverage of the material on the test, nor was anything of substance done for the "project" (which for most of us turned out to be a horrible Access database).
Teaching needs to be improved around the CS curriculum (at the secondary school and college level) before we go about getting kids to do a really shitty CS course.
Our year was the last year to do a Computing A Level, quite a shame.
Agreed with the divide. Technically, back in the early '90s we did "office studies", "mathematics" and "technology", each of which had overlapping portions of relevance.
It's still like that today.
It's just with the ubiquity of computing, most people just don't give a shit about writing software any more because they don't have to when doing useful work.
And it's horribly poorly written, which will likely limit the number of people who sign it. But it has the most traction (there's another one that has one signature).
I think that there should be good classes that kids can opt into, but personally I don't think that it should be compulsory. There is a tendency for adults to want their children to be like them (but better of course). Probably every fire fighter wants their kids to follow in their footsteps.
On a personal note, whilst I sit as both 'coder' and 'business' person I enjoy my children learning about what I do without pushing them. What I love to do with my kids is to take them out into the world so that they experience different cultures, discover nature, learn to kayak, ride bikes and.... well, be kids. If they develop an interest in something then I will support them, but I'll not be a pushy parent. I would be delighted if my children followed in my footsteps, but as long as they do their very best in whatever path they choose to follow, that will make me the proudest.
Lack of exposure to activities like coding is a problem in society beyond HN-types. These activities should be something children are exposed to in school because their home environments will not provide this exposure.
I have some hazy memories of them trying to teach us Logo in the mid 80s but little beyond that.
I think it's a good idea because it's so cross-discipline and can be made 'fun' (doing mostly simplified graphics programming, etc). Things like teamwork, math, music, art, logic, English, and even the scientific method can all combine into a project that's both productive and fun.
Maybe times have changed, but when I was at school, subjects were strongly compartmentalized and rarely combined (other than the occasional bit of math in a physics class), yet in the real world, combining disciplines makes up most of the work day.
This is a fascinating topic that is always debated. There are always arguments about how young to start, which language (BASIC, Logo, Scratch, Python, Haskell (yikes!)), and which paradigm (functional, OOP, imperative/procedural, logic, etc.). I think the point Eric Schmidt made in the article is that the only wrong move is not to teach anything :) As long as something is being taught about how computer hardware and software work, it's a move in the right direction! His concern is that students are taught only oversimplified computer skills, like using a word processor and a spreadsheet, rather than an understanding of some basics of computer science and engineering.
I'm currently teaching introductory programming to non-engineering majors at a major university in C (yes, I know, ugh; I would have gone with Python). I've noticed that the biggest problem seems to be that the computer education these kids were given earlier in school didn't teach any independent, critical thought. They struggle when things aren't EXACTLY as they were taught (e.g. if the Microsoft Word icon is on a different part of the desktop).

I've been trying to teach them to use their brains and to work out how to solve a problem without asking me for help. I try to make it clear that I do not have the answers to every question regarding computers... neither does any other person on the planet. It requires deduction, inference, experimenting, lots of Google searches, forum posts, and infinite patience. Turns out they can handle it! You just have to let them believe in themselves and let them struggle with it for a bit. In the end, they appreciate it and enjoy it more.

I even show them a lot of things that people feel might be too advanced, such as the command line and interfacing with external libraries. They can get it if you break it down for them. I think the problem is often that the teachers themselves are not experts in programming and have difficulty conveying the material to students.
I taught my kids Scratch from MIT. I think that is sufficient for the average 9/10 year old. Then the girl went on to learn more on her own and is now programming electronics.
If LOGO is no longer being taught in schools as it seems, then we have a problem. LOGO is what made me love programming. That fucking turtle/pixel. Goddamn.
The problem, I feel, with my generation is that we're too often called 'Digital Natives'. We lull ourselves into believing that we are 'good with computers' when in reality we haven't a clue as to what's going on beneath the surface.
To make a comparison, it's like being able to operate a microwave oven versus understanding how a microwave oven works. We press a few buttons and four minutes later that frozen lasagne is piping hot. But from a purely academic point of view, wouldn't it be fantastic if everyone had an understanding of the process in between?
Oh, and list 10 distinct careers where an in-depth knowledge of the Haber process is required, as that is one of the areas taught in GCSE Science for a disproportionate length of time.
I have my 7-year-old using Scratch and she loves it. It's a great intro to analytical thinking & problem solving. Even if she doesn't carry on programming it is a valuable thing to learn for life in general.
First, that is a remark with extraordinarily sexist undertones. What makes you think that she doesn't love it? Have you ever used Scratch, or even looked at it? It's a huge cartoon-and-legos playground. It's great for kids. But moreover, she likes problem solving. But I guess to you, that's not the domain of women or girls.
"He must be thrusting this upon her, because little girls don't love problem solving!"
Give me a break. Judging from your other posts in this thread you sound like a very judgmental, smug & condescending person, exactly the characteristics I point out the folly of to my kid. Plus, you're just way off-base, factually.
No no no no please no. Don't teach them to code. Teach them to think first.
If you give someone a hammer without telling them how to use it, they are going to hit some fingers and get frustrated.
I think we should teach children to understand mathematics at an algorithmic level and let them see that programming is a tool/hammer for executing those algorithms. Unfortunately, current educators tend to concentrate on the purely mechanical aspects of mathematics, such as arithmetic and punching numbers into off-the-shelf formulae, which is the real problem.
Personally, I think they should just throw in valid use cases that map to reality, like calculating GCDs or Newton's method, and let them knock up the programs in TI-BASIC on their standard-issue TI-84s.
That's enough for school, even to A-level.
Footnote: For reference I was brought up at school in the UK on LOGO, BBC BASIC (6502 and ARM), LEGO Dacta and TI BASIC. Life was good for me, but I know that my peers found it frustrating when they just wanted to solve a problem and go home.
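The exercises suggested above really are only a few lines in any language. As a rough sketch (in Python rather than TI-BASIC, purely for readability), Euclid's GCD algorithm looks like:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # prints 6
```

Newton's method is a similarly small loop, which is what makes these good calculator-sized exercises: the whole program fits in a student's head.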
I completely disagree with this; programming is a fundamental skill in a variety of careers and can be used as a tool to teach how to think.
It isn't an either-or proposition. Incorporating programming exercises into any science-oriented class is not only illustrative; like any hands-on lab, it teaches students to think like scientists and helps create a more realistic microcosm of how science is actually done these days, with programming forming a bridge between theory and experiment and heavily relied on in analysis.
"We want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology."
Programming: it's not just a skill. It's a way of life.
Pretty much anywhere people use Excel. Any job where people use spreadsheets and their output is quantified and measured against their peers is ripe for automation by enterprising individuals. Why not be the fastest, most accurate analyst on your team?
From pg: "To get rich you need to get yourself in a situation with two things, measurement and leverage. You need to be in a position where your performance can be measured, or there is no way to get paid more by doing more. And you have to have leverage, in the sense that the decisions you make have a big effect."
And also from Zed Shaw: "Programming as a profession is only moderately interesting. It can be a good job, but you could make about the same money and be happier running a fast food joint. You're much better off using code as your secret weapon in another profession.
People who can code in the world of technology companies are a dime a dozen and get no respect. People who can code in biology, medicine, government, sociology, physics, history, and mathematics are respected and can do amazing things to advance those disciplines."
Letting a kid hit some fingers and get frustrated can be the best way to prepare him/her to care about listening to an explanation of how to use a hammer. I think the reason I enjoyed SICP so much at MIT was because of my teenage experience banging my head against the limitations of BASIC.
Not just for kids, either. I realised at some point that one of the best ways to learn things is to try, give up, and then come back later and try again. By some magical means, the timespan in between always makes it tremendously more intuitive the second time around.
I agree. I hated maths in school where you had to plug numbers into formulae. I could never remember them correctly. This stuff should be made interesting so that kids with their natural curiosity will figure out how to do it.
I truly started appreciating maths after I experienced cases where you could code an elegant solution based on the maths in about 5 lines, as opposed to trying to solve it your own way and ending up with 30+ lines of code and a program that's still buggy.
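As an illustration of the kind of five-line, maths-backed solution described above, here is a sketch (in Python, chosen only for readability) of Newton's method for square roots:

```python
def newton_sqrt(x, tol=1e-10):
    """Approximate sqrt(x) by iterating guess -> (guess + x/guess) / 2."""
    guess = x if x > 1 else 1.0  # any positive starting point works
    while abs(guess * guess - x) > tol:
        guess = (guess + x / guess) / 2
    return guess
```

Each iteration roughly doubles the number of correct digits, so a handful of loop passes does the job of dozens of lines of ad-hoc approximation code.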
> I think we should teach children to understand mathematics at an algorithmic level and let them see that programming is a tool/hammer for executing those algorithms.
Computers are useful for doing math, but for most kids, and even most adults, that's one of the least interesting things they can do.
A program isn't just a hammer. It can be a paintbrush, a tape recorder, a story, a symphony, or a lab assistant. And a program is more than any of these things -- a real-life paintbrush can't change the size of its bristles, and you have to draw every stroke by hand. On a computer, everything's just code. You can run your "paintbrush" routine billions of times, vary its parameters using your voice, or earthquake data, or a physics simulation, or anything you can imagine.
As a programmer, maybe I'm biased, but I can't imagine any job beyond the most menial where the ability to program wouldn't make me more productive.
Agree. Kids need to learn logical thought processes and problem solving. They don't need to know how to 'program'.
How do we even know 'programming' will be a relevant skill in 30 years time?
Buy every kid of 3 a tub of Lego bricks (Not a 'set'). That would go a long way to getting them thinking, problem solving, and building stuff.
And don't get me started on the "There aren't enough females in programming" BS rolleyes
FWIW, I would happily sign a petition against this. I do not think teaching kids to 'program' at school so early is worthwhile. Programming is an extremely niche career, and anyone who is interested in it can easily learn.
You're likely to bore 90% because they're not interested in learning programming, and the other 10% will have learnt it all at home years ago and will also be bored.
Perhaps if more programmers were working on a solution to cancer, instead of a social webapp to allow people to share photos of their dogs...
Not to mention the number of programmers who waste their time creating new languages and frameworks. Then they spend years rewriting everything in node.js or whatever the fashion of the day is.
> who waste their time creating new languages and frameworks
Well that's just laughable, and says more about yourself than any of these allegedly offending third parties. There has been tremendous progress in both frameworks and languages over the last decade. If you can't see that then, well, I don't really know what else to say, other than to perhaps get off your lawn.
> There has been tremendous progress in both frameworks and languages over the last decade. If you can't see that then, well, I don't really know what else to say, other than to perhaps get off your lawn.
If that's true, can programmers today do things they couldn't do 10 years ago? Can they program faster than they could 10 years ago due to all these innovations in languages and frameworks? I'd say no on both counts.
So you're suggesting that the time to market of an ASP.NET MVC, NHibernate, Autofac, Razor, SQL Server application is going to be lower than that of a classic ASP and SQL Server application?
I wouldn't know anything about the stack you describe. Seems to me the barriers there are all Enterprise related; framework innovation isn't the bottleneck. Big business software - perhaps that's something that hasn't improved much, if at all.
But this is a startups related site, in case you hadn't noticed. Web programmer here. Web frameworks have improved immeasurably. If you tried to tell a web programmer that nothing has improved since 2001 you would just be laughed at, and rightly so.
It's not just about the web, though. Look at the new functional languages. Look at the message queues, the image libraries, computer vision, sound... bloody everything. Tell me, is H.264 better than the MPEG video of 2001?
Step outside your bubble man. In fact, leave your bubble, it sounds pretty depressing in there. This is a very exciting time to be a programmer.
And as for startups, I worked for one once. It's a big business now.
I build software which happens to reside on the web, not "web sites" or "web applications". Sometimes we have desktop applications deployed because they require real-time data. Sometimes we integrate with massive systems with millions of users. We don't use functional languages (they do not suit our workload and don't scale to our recruitment requirements), we use message queues (NServiceBus, zeromq), we use image libraries (GDI, reportlab), we use all the funky nosql stuff (Mongo), we use ORMs (NHibernate), we don't use computer vision or sound because we don't need it in our space.
Big, exciting things happen where I am.
It's the bubble you are in that I'd hate to be in. The superficial one which appears to be based on pushing social crap, novelty iPhone applications and glorified todo lists on the world.
My problem is that we did ALL OF THIS in 2001. The process hasn't changed, but the tools have and the time to market is the same.
But 99.9% of these innovations don't help the world. They help a small subset of a small subset of a small subset of problems from a small subset of a small subset of a small subset of society.
Compare that to someone working out how to stop Neurofibromatosis dead in its tracks, for example (a condition my youngest daughter has); they'd give 1 in 3000 people a better chance.
People get a Jesus complex because they build a tool that a vocal minority uses.
Even if we accept your claim that only 1 in 1000 inventions help the world, which I think is pessimistic, that's still progress. What would you rather people do? Not invent?
A programmer writing a new framework might not directly help cure Neurofibromatosis, but they might make things slightly easier for another programmer, who is then inspired to work on his program, which saves a medical researcher a few minutes and gives him the time he needs to have his breakthrough. Or maybe a better search result gave him what he needed. Technology is cumulative. Criticising people because they're not working on your favoured project is pretty lame.
I agree with your sentiment but I see the other side of it. Technology is cumulative but noise slows down decision making and fragments knowledge.
The world does not need 200 programming languages, 1000 JavaScript frameworks, 500 different web servers and 20-odd social platforms which exist only to boost the ego of an originator who can market their idea better than others. It needs some concise, un-fancy-looking tools that are fit for purpose and can be used as common ground for communicating ideas. There is so much fragmentation it's unbelievable.
Everyone thinks they can do better, yet no-one delivers any real efficiency improvement.
TBH, going back in time, I can still deliver the same output as node.js with classic ASP/JScript from 1998, arguably with less code and time spent.
Back on the subject of med research; they tend to still use Perl, bits of sticky tape and TI89/92 calculators a lot apparently (word of mouth from a friend who works in tissue sample analysis).
> The world does not need 200 programming languages, 1000 JavaScript frameworks, 500 different web servers and 20-odd social platforms
But .. that is the way we find the way forward. It's like evolution. We may not need 500 different web servers but .. that's a lot of different ideas, a lot of variation, and in the long run, a lot of innovation. Eventually the "best" way is chosen and we go to the next generation.
I actually think the proliferation of programming languages and frameworks, especially open source, is one of the purest meritocracies you can find. They live or die by their quality, innovation, new features. And the whole community is dragged forward. What you are advocating is basically central planning. Did you realise that?
> I can still deliver the same output as node.js with classic ASP/Jscript from 1998 arguably with less code and time spent.
LOL. Dude. No way.
Something you said earlier was instructive as to the mistake you are making with this line of thought. You said that "programmers can't do anything more today than they could in 2001" - or something similar. Or that other guy said it and you agreed, whatever.
Look, that's literally true. In fact, it's literally true to say that any programmer can do anything as long as they use a Turing-complete language. This has been true since the days of ENIAC.
What has advanced, massively, is the knowledge of the best ways of doing things. Design. Efficiency. Elegance, maintainability, scalability. That is what has mainly improved. Knowledge. The laws of the universe have not changed, everyone agrees with that, but we have worked out, a little more, what works and what doesn't.
And I'm afraid your ASP/Jscript falls into the latter category. I'm sure you could mimic some tiny subset of node.js using that toolset. But it would be a hideous mess of spaghetti code. Unless, of course, you used the knowledge you gained over the last 10 years to basically reimplement node.js in ASP, presumably gaining express pre-approved VIP entry to Hell in the process. So things have advanced, haven't they?
Tell me two characteristics or features of node.js that cannot be built (in an old-fashioned circa 2001 way) with either ASP/JScript, Perl/mod_perl or PHP 4?
I'm not interested in "scalability" as that's moot - hardware is cheap.
You should see some of the code that smart, hardworking and logical people end up writing in the sciences because they are 10 years behind in programming education and experience. Then the next generation of PhD students rolls in, and rinse, repeat.
His experiences can be followed through his blog. Previous posts:
http://cdsmith.wordpress.com/2011/08/16/haskell-for-kids-wee...
http://cdsmith.wordpress.com/2011/08/23/haskell-for-kids-wee...
http://cdsmith.wordpress.com/2011/09/03/haskell-for-kids-wee...
http://cdsmith.wordpress.com/2011/09/10/haskell-for-kids-wee...
Well worth the read, and a good example of how programming can be taught to children.