Bill Gates on the future of education and programming (gigaom.com)
54 points by wx196 on July 20, 2013 | 31 comments


I think what we need today are competent generalists - super-teachers if you will - who have deep (i.e., at least post-graduate level) proficiency in what are currently considered separate fields such as Physics, Philosophy, Mathematics or Biology. There's a goldmine of new breakthroughs waiting to happen at the intersection of disparate areas of knowledge. Generalist teachers/professors at the first- and second-year university level can help the next generation be more adept at recognizing the connections between fields of study. Unfortunately, you are currently not recognized at the higher rungs of academia unless you super-specialize and churn out papers. We really need to acknowledge generalization as well, even when it comes at the cost of original contributions.


This.

This is also an argument against doing a PhD, where you are expected to focus on a very narrow field. The problem is that we want to measure academic progress by numbers, in the form of paper counts, so scientists need to publish a lot or they run out of funding. But if there is always this pressure to publish, there is less time to step back and see the big picture.


The scientists who do that are lazy then. Publish less, but higher-quality papers, and reflect a lot more. Ya, it's tricky, but then why are we doing this anyway?


Wow ... this is so off the mark, it is hard to come up with a coherent response. Scientists don't operate in a vacuum - if everyone around you publishes two strong papers a year and you don't ... well, you aren't going to last very long. How do you judge quality? I've had friends who are far smarter than me and worked on more important problems, yet failed out of academia just because idiotic reviewers didn't appreciate their work. Science has fads, and scientists as a whole act worse than high schoolers. The publish-early-and-often mentality that is prevalent today is the same as how aspiring high schoolers compete for spots in the Ivy League. Seriously ... 16-17 year olds writing books? Founding international charity organizations? All this is great stuff, but really ... what is the motivation behind it if not to have a great college essay? If a high school student were to merely focus on his course work, study diligently, and try to have a balanced life, would that kid get into an Ivy League school? That is the bar that academics must deal with. The difference is that if you don't get into an Ivy League university, life isn't over. You go someplace else and keep up the good work. When you fail out of academia, it is far more challenging to integrate into the regular work force. Some of my friends who are thinking about getting out of CS research are mortified ... the technology landscape seems very alien to someone who did their undergrad during the late 90s-early 00s. Sigh.


Your response definitely isn't very coherent. If that is my fault, I apologize.

As an almost-academic who did his undergrad in the mid 90s, I'm not too afraid of getting out of research, though it's a nice place to be since we have so much freedom. It helps to be in a less theoretical field.

The reality of academia, especially in computer science, is that the system encourages a lot of bad behavior like crap publications. It is the job of the scientist to work around this bad behavior and still try to do good. Tough, yes, but necessary. If you can't juggle this, then failing out is more honorable than degenerating to fit the system, which would be a quite pointless existence.


Apology accepted.

> If you can't juggle this, then failing out is more honorable than degenerating to fit the system

Science isn't what you think, or what I think. Rather, it is what the community thinks to be true at any given point in time. If the community as a whole judges by quantity instead of quality, that's what most scientists will conform to. It is important to understand why this is the case. I believe the primary reason is that quality is extremely hard to judge. Most conference program committees will tell you this. You can easily spot the diamonds and the turds. How do you sort out the rest?

I also disagree with your overall attitude in which you consider academics dealing with their reality to be a "pointless existence". However, you are certainly entitled to your opinion as I am entitled to disagree with it :)


The established part of a community wants a system that protects its best interests. "Dealing with reality" is not a pointless existence, but just pandering to and playing the system is. The system must be worked or worked around for good (whatever you think that is). This makes me happy at least, but to each their own rationalizations.

I have been on enough PCs to know how it works. Sometimes that diamond is just a well-written turd, and sometimes there is a diamond hiding inside the turd. Reviewing is much more than just sorting the rest.


> "A skeptic might say that’s like robbing from the not-so-rich to give to the poor."

More like "indoctrinating 90% of the population in one sphere to demand the substandard, then profiting massively, then flailing around wildly with charitable work to try to cover your shame"


Personally, I just see that we're in a world where super-elites decide when and how to allocate resources. Many don't have a right to life, except to the extent that it fits in a framework acceptable to elites.

For instance, with healthcare, Gates explicitly supports intellectual property regimes, like medical patents. This is in keeping with his decades-long ideologies, and vital to how he went beyond his wealthy parents. (http://newint.org/features/2012/04/01/bill-gates-charitable-...)

Or with education, helping push privatized schools. (http://en.wikipedia.org/wiki/Bill_%26_Melinda_Gates_Foundati...)

I'm not really discussing Gates himself. (Despite him putting his name in the forefront of his actions and press releases.) I imagine his charity is far better than how the Koch brothers allocate their wealth. Rather, these points apply to depending on the benevolence of elites in general.


Now this is completely hive mind, unnecessary, off topic, and not really related to the OP, but:

> I frankly don’t see that much of a downside.

I think Windows taking over the world from 1995 to 2008 has absurdly destroyed a lot of hacker interest in computing devices. I'm talking about those born around 93-96 who grew up during complete Microsoft dominance. There is no (good) terminal, until recently the development environments and toolchains were behind paywalls or not included, and terrible habits like "reformat when something breaks" emerged because of how undocumented and malignant a lot of DOS/NT behavior was. A generation raised on closed platforms never gets to realize the inherent mutability of the internal systems in these devices, and I think this promoted a huge amount of the computer illiteracy we see rampant today. Microsoft did give people what they wanted - brainless, easy computing that takes no thought and was effectively consumable and disposable - but at the cost of a lot of the engineering potential that might have existed had they distributed a tinker-able sandbox rather than a black box. Rather than being knowledgeable about the workings of their devices (which are more and more taking over their lives), people are dependent on them but know nothing about them besides how to smack the keyboard or tap the Facebook button.

Bill has done a lot of good in education outside this, but the undercurrents of Microsoft's two-decade takeover of consumer electronics will have lasting negative implications on computing for probably an entire generation. We don't know what the alternative might have been, but I know from my peers (I'm 21) that there is an absurd amount of illiteracy about and apathy toward these devices, because they were raised on Microsoft products and expect them to work or just get replaced, rather than being hacked on to fix them. This doesn't even start on how the majority of web devs seem to be 25-40, explicitly because they grew up on Netscape, telnet, etc., and not IE. I see a firm line right around when XP came out and the entire browser space collapsed into IE: anyone I know who is currently 15-20 had a significant drop in interest in web tech as a result.

> “Anybody who thinks getting rid of [patent law] would be better … I can tell you, that’s crazy,” Gates said. “My view is it’s working very well.”

Patents seem to still work (due to their short duration), so I'm not arguing patents, but copyright has destroyed a supermajority (I see estimates in the ballpark of 95%) of the media and content created over the last hundred years, because it all died and all copies were lost while still outside the public domain. There is a reason all modern media takes its roots from 16th-19th century media - that is the only place you can reference without landing in a lawsuit minefield.

However, I see no reason at all why all this nonsense can't be abolished and culturally we could move towards a systemic crowdfunding approach where people propose ideas, everyone invests in the creation of their ideas, and the result is inherently public domain. The creator eats, the public benefits from any idea someone may have, and we don't end up with a huge fraction of culture and innovation lost under a rug of time.

I love Bill Gates for the good he does with his money, but I'm not going to blindly agree with him just because he's a genius or because he's rich and popular. I think Microsoft caused a lot of systemic societal damage, and that IP law is completely out of control and unnecessary in this day and age.


I disagree.

As a counter-example: I am 36 years old and have worked professionally creating software since I was 20. Sure, I tinkered with BASIC programming on my first computer (the fabulous ZX81), but I first really got into programming in the early 90s with Turbo Pascal/Turbo Assembler in MS-DOS. In my world, Microsoft was completely dominant at this time.

My own little pet theory on why younger people don't seem to do as much programming:

Being online and having such easy access to so much information, entertainment, communication, etc. has generally made people more impatient and less willing to really commit to the kind of single-minded focus that is needed to get into and really develop programming knowledge.


Intuitively I want to agree with you, as I've had similar thoughts, but is it really the case that younger people don't do as much programming? I'm around your age and almost nobody did programming when I was a kid, or even when I was first getting into the workforce. Now it seems like there are hacking meetups everywhere, millions of people posting their code to GitHub, and more young people than ever to compete with for work.


Well, nobody programmed in general back then, since computing was so new. The industry is just significantly larger now. The trend, though, I think is that computing has gotten huge, but actually developing that computing hasn't grown at all among the youth, just the consumption of it. You lose a lot of tinker-ability when you move to GUIs, and especially when you don't distribute developer tools or learning resources.

I mean, think of the generation growing up on iPhones. How would they ever get exposure to programming or tinkering on a device like that?


Paragraf is a great app to make graphic shaders. It's easy to use touch x/y coordinates and the camera feed as a texture.

The results look beautiful on retina displays. I've gotten several kids hooked on it, even overcoming some of the GLSL syntax hurdles to get their programs to work. I also gave them cheap $5 jeweller's loupes so they could inspect pixels as the program runs.

Lots of kids genuinely want to create stuff.


Yeah, it would be nice to have some hard data. My (probably quite random) data points are mostly from being involved in recruitment over the last ten years.

I also didn't know anyone who was as much into programming when I was a kid. Now that everyone is online, it is so much easier for people into particular things to find each other. It's not all bad, perhaps. :)


I'm not arguing there aren't exceptions, and maybe I should rephrase the time period - I'm talking about kids who never had DOS. They started their computing experiences on GUIs in Windows 95. I started on DOS at around 2-3, and I know my personal interest in computing was heavily developed by the need to use a DOS terminal for a few years. I know my younger brother, his peers, and pretty much everyone else I grew up with (not many 2-3 year olds played on a computer, but I liked playing Wolfenstein at that age) don't know what a terminal is, what code looks like, how to do terminal piping, or how to navigate with terminal commands, because they never saw it.

I'm also not arguing there aren't exceptions, but my older cousin grew up on the C64 and spent ages 5-10 basically hacking on that thing. That was a tremendous growth experience for his eventual CS education. It went from upper-middle-class families owning a tinker box for their 5-15 year olds to upper-middle-class families signing kids up for Facebook and cell phones before they hit 10.

> Being online and having so easy access to so much information, entertainment, communication etc has generally made people more impatient and less willing to really commit to the kind of single minded focus that is needed to get into and really develop programming knowledge.

I think they are easily distracted. I know I was; I went from scripting NWScript persistent worlds in NWN at 12 to playing WoW all the time. I learned some Lua there and tweaked some addons, but my programming experience was effectively stuck from 13-18, though that was mostly because MMOs were a great escape from horrible public education settings, and I liked turning my brain off at the time, since I never really had a reason to turn it on.

But as soon as I hit my senior year of high school I got my act in gear, mostly because I was tech-savvy from my earliest interactions with computers, and from doing exactly what Microsoft didn't want me to do with my devices (hacking them). I set up a dual-boot Ubuntu, learned Python / Java, got my BS, and I've only gone down the rabbit hole since then. I've been soldering recently, run Arch, use terminals all the time, and I recently did a CMOS reset over 3.3V on a router.

But that required a spark that only really came from my most developmental years having to interact with computers as tinker boxes rather than colorful graphical click squares, and few of my peers got that experience.


I agree.

Since the era of Microsoft dominance, we've done a fairly good job of selling appliances and single-purpose software & services, but I think we've failed to sell general-purpose computing to the public. We've barely even tried.

It's a shame. Not only are most people missing out on the amazing power that general-purpose computing brings, but we're now in a situation where people are willing to cede that power to copyright maximalists.

Arguably, "the coming war on general-purpose computing" wouldn't even be a possibility right now if the industry (led by Microsoft) hadn't raised such high barriers between "developers" and "users".


I don't know about this. Google Glass and Android, while not stellar platforms for the tinkerer, have been open enough to allow FOSS to trickle in around the cracks, leading to things like CyanogenMod or the four FOSS phone platforms (Tizen / Sailfish / Ubuntu Phone / Firefox OS). I think those will keep open platforms for general computing a thing, even when businesses desperately want people using locked-down platforms that access cloud storage and server-side applications to maximize their influence.


"We don't know what the alternative might have been"

I can make a guess: if Microsoft had refused to give people what they wanted, somebody else would have. After all, Windows more or less started out as a "copy the Macintosh, but keep backwards compatibility" exercise. IBM could build that, too (and did, with OS/2, which was even more closed than Windows), and chances were that Digital Research [edit: make that Novell; they bought DR in 1991] could have, too. What could Microsoft have done about that? Stubbornly selling something that users do not want would, IMO, just have led to a different winner.

I also fail to see why you would ever expect the average computer user today to be as knowledgeable of computers as that of 20 years ago. The demographics of 'computer user' have changed enormously over that time period.

Similar things have happened everywhere. We have people washing clothes who could not build a washing machine, people doing a bank teller's job with the help of an ATM who know nothing of double entry bookkeeping, people driving cars who don't even know what engine knocking is, people heating their home who don't even know how to make a fire _with_ a match, etc.

And yes, I think people who don't really know how their system boots or how a file system works miss out on a lot, but then, I don't really understand the underlying physics either.


I'm not arguing consumer computing is bad. I think it is great, and you are right, it is inevitable. I'm more talking about how Windows of the 95-2008 period didn't ship with development tools (which were also behind non-student paywalls), had terrible sysadmin tools, and had nothing pointing a user of any Windows computer towards hacking on it. It was presented as a washing machine (mainly an MS Office or IE box) when it could do a lot more than wash clothes (general-purpose software).

There is a generation where a lot of potential techies were lost because their day-to-day compute platforms didn't have easily accessible or discoverable development tools the way a C64 or Mac II would have had (included, by necessity). I started on DOS, which was one of the major learning points for me - as soon as Windows adopted a GUI, you never experienced the actual terminal of a computer anymore. Great for the consumer of a tool, terrible for a generation of unknowing would-be hackers lost to a black-box technology.


> I think Windows taking over the world from 1995 to 2008 has absurdly destroyed a lot of hacker interest in computing devices.

I'm 32, so not really falling into your description when it comes to age, but if it weren't for Linux I wouldn't be here today writing code for a living. I was around your age and I was almost illiterate when it came to programming (but very good at maths and all), until I installed a Mandrake Linux distribution on my PC and then discovered Python.

Like you said, even having access to a decent terminal can make a huge difference: the difference between just typing "python" and getting its REPL, versus struggling to set and export some crazy PATHs in DOS (and to this day I'm sure I discovered copy/paste in the DOS terminal by accident).
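
Just to make the contrast concrete, this is roughly the whole barrier to entry on the Python side (a made-up session, but representative of what typing "python" gets you):

    $ python
    >>> 2 + 2
    4
    >>> def greet(name):
    ...     return "hello, " + name
    ...
    >>> greet("world")
    'hello, world'

Immediate feedback, no PATH fiddling, no compile step.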

It's good that Bill Gates does the things he does now and I understand why people would look up to him, but we should also think of the huge opportunity costs that Microsoft's policies imposed on the rest of the world.


Microsoft shipped development tools with its OS for as long as I can remember using them. Typically tools with very low barriers to entry, too; they just worked out of the box and usually came with sample code.

DOS shipped with QBasic, Windows 98+ shipped with VBScript. Win 98 even included a small web server with ASP support and databases via Access.

Not to mention Visual Basic, which, yes, was technically not free, but I don't remember it having meaningful copy protection, and pirated copies were handed around in geek circles like Photoshop is today.

There were also a bunch of third party drag and drop game creation tools available, many of which were scriptable in some capacity.


I disagree completely. I fit the demographic that you say is "computer illiterate" (I'm 19), and I have been interested in programming since I was 12 years old. Microsoft wasn't the only influence during that time. Valve was too.

I learned most of my programming through the wonderful tools that Valve gave its users for the Source engine. If it weren't for Valve, I probably wouldn't be interested in programming today. My family has no computer background at all. I learned this all on my own. My dad didn't even know what Windows was until I explained it to him.

The point I'm trying to make here is that, sure, Microsoft sucks, but you can't blame a single corporation for a trend so massive. There are other platforms. There WAS a good terminal: the in-game Counter-Strike: Source console. And Valve's Hammer editor is and was incredible for level editing. And have you seen Garry's Mod's modding infrastructure? It's all open source (using Lua), and the community is super supportive and helpful. You're just looking in the wrong places.


You are making a classic economic error, here, of seeing a company with good marketing that grew huge by selling people something, and assuming that it is that particular company's fault for causing the market to want that something. A much simpler explanation for the evidence is that people already did want that thing, but had nowhere to get it; and that the company grew huge by responding to untapped demand.

People, by-and-large, like magic. People want appliance-like technology that they don't have to understand. It is not Microsoft's fault (or Apple's, or Google's, or IBM's, or Nintendo's, for that matter.) People have wanted appliance-like computers since before there were computers.

The computer-like devices in speculative fiction novels from the early 20th century (written by, and for, the nerdiest of nerds) almost never resembled "general-purpose computers"; they mostly envisioned something roughly combining Skype, a wiki, and Siri. Engelbart's Mother of All Demos is a good example of what people were dreaming about: fluid communication and information sharing/retrieval. "Programmability" was never really a feature given much concern, even by the most idealistic futurist. People just don't want programmability. They don't lust after it like they do other whiz-bang features. They don't dream of a world where your tablet can be used to write programs that run on your toaster.

As for why...

As far as I've observed, programming a computer is, to most people, a mind-numbing consideration; a complex task, with a lot of rules to understand, facts to memorize, considerations to hold in one's head, and requiring heavy rigor. It's Engineering, in other words.

Most people aren't engineers; there is a specific mindset required, and most people don't seem to like being in that mindset, let alone staying in it for long enough to get good at using it. It seems to come down to "logical thinking" (or "System 2" thinking[1], or "Formal operational" thinking[2].) From what I've seen in trying to teach people to program, for most people, it hurts--possibly as far as causing an actual headache--to engage in rigorous, conscious logical analysis. People both try to avoid it for as long as possible, and to escape it as quickly as possible once they've started.

We--I don't know which "we", maybe "hackers" or "engineers" or "nerds" or some other term--we here don't seem to have this problem. Logical thinking comes simply to us; we get jobs requiring almost constant use of it, and then we engage in it in our off-hours as well. We might never stop using it. We are a minority.

This is why I say that "programming literacy" is a bad metaphor for becoming skilled at programming: "literacy" is something that (excepting learning disabilities) everybody is equally capable of attaining. Imagine if 90% of the population--forced to practice reading for years--could still only manage to read five or six words per minute, and doing so for 20 minutes or so caused them to need to take a walk outside to "clear their head." That's what happens to the "bottom hump" of the bimodal distribution in a CS101 class--and they're the ones prequalified by being in a CS101 class; the population at large does even worse.

So, there is no reason the average computer needs (serious) programming tools pre-installed on it, since there is no reason to expect the average person to want to be a programmer. Sure, programming tools should be made available to anyone who wants them, from a young age--as should supplies for any other craft a child might want to explore (paints, a keyboard, clay, a chemistry set, etc.) But every computer these days has the internet--and most of them even have app stores!--so it's easy enough to find, and then download, something like Shoes or Scratch if you want to play around and learn. I could even see including a learning environment like that (or maybe something more like Hypercard; if only~) with the OS, for the same reason Windows ships with MSPaint, and OSX ships with Garage Band.

But shipping Visual Studio, or XCode, or even gcc, with a computer, is more like giving everyone AutoCAD than giving them MSPaint. It's silly and overkill: only 0.1% of the population would even know what it was if they accidentally managed to get it open. Of those, most of them wouldn't be at a level to understand it anyway; they would need something easier to start with.

(All of this is separate from the "the system relies on the compiler toolchain to build or update parts of itself, so it needs to be there" argument. You might need to ship gcc with your OS, and might even use it to compile packages the user installs (i.e. in Gentoo), but that doesn't mean the user should be aware of--or care--that compilation is going on. Nobody notices that the .NET framework ships with a compiler (ngen.exe) which is then used to locally compile all the .NET stdlib modules in the background after installation; it just happens. You don't think "I have a CLR compiler on my system" when you install the .NET framework; you only think that once you've installed MSVC, because that gives it a user-accessible UI.

The same goes for "system runtimes"--the only thing anyone is recommended to do when their OS includes a copy of Ruby (usually for the sake of executing system scripts written in Ruby, not for the user) is to install a fresh copy through RVM and avoid the "system Ruby" like the plague. If it's so distasteful to the language's programmers, why are we as OS builders even exposing it to them? Stick the binaries in sbin instead of bin; that's what it's for.)

[1] http://en.wikipedia.org/wiki/Dual_process_theory

[2] http://en.wikipedia.org/wiki/Piaget's_theory_of_cognitive_de...


I'm not arguing about whether the demand for the personal computer as a clickable Facebook or spreadsheet machine was a demand MS created. But what MS did was heavily obfuscate its platform's inherent general-purpose compute functionality by shipping no standard scripting language besides batch files (I'm not sure whether 95+ had a Visual Basic interpreter). They didn't provide a compiler - hell, they charged you for it - until MSDN came along around 2000, and development material was sparse and expensive (compared to man pages, which had existed for 30 years before that).

My point is the barrier to translate from interactive brick tool -> hackable general purpose compute platform was a huge divide for 15 years because Windows tools were hard to come by, hard to find, or otherwise inaccessible.

Also, I don't buy a concern about space consumption for runtimes. Microsoft was profiting off of selling MSVC for two decades and plenty happy with the pocket change. You can pack mono, java, python, ruby, lua, node, perl, gcc, clang, and a beefy IDE on a Linux install CD and... (checks)...

on Arch, all those packages combined consume (in order) 45.2 + 15.8 + 13.9 + 3.7 + 0.2 + 31 + 13.6 + 20.3 + 21 + 9.8 (codeblocks) = 174.5 MB. By comparison, the libreoffice package contains a combined 345 MB of archives. (pacman -Si <package names> | grep "Download Size")
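
(If anyone wants to double-check that sum, here's a quick throwaway Python sketch using the same numbers, in MB, in the order listed above:)

    # download sizes in MB, in the order listed above (codeblocks last)
    sizes = [45.2, 15.8, 13.9, 3.7, 0.2, 31, 13.6, 20.3, 21, 9.8]
    print(round(sum(sizes), 1))  # -> 174.5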

So every dev tool under the sun takes up half the space of the office environment on Linux, and if I wanted to I could throw in both of those, plus KDE (the full package is 450MB; wallpapers add 80MB), and still use only around a gigabyte. A Windows 7 ISO (I have a copy) is 3.3GB. So three times the space, spent on what, I have no idea. It doesn't even come with .NET 4.


> They didn't provide a compiler - hell, they charged you for it - until MSDN came along around 2000, and development material was sparse and expensive (compared to man pages, which had existed for 30 years before that).

It looks like you're talking about the UNIX tools that existed for "30 years before" the year 2000.

The problem with that kind of thinking is that for most of this period, those UNIX tools required a $50,000 minicomputer or a $5,000 workstation to run them on.

In comparison, the PC made it much cheaper to get access to programming tools -- even when you add in the cost to buy a C compiler separately.

It was only when commodity PCs running Linux took over the UNIX world, in the late 1990s, that those man pages actually became cheap.


You could buy Borland Turbo Pascal or Turbo C and hack away. They weren't free but were pretty affordable in the context of software of the time.


> the barrier to translate from interactive brick tool -> hackable general purpose compute platform was a huge divide for 15 years

I'll admit: back in the 90s, without the Internet, it was harder to bootstrap a programming environment on Windows (though in 1985 batch files weren't all that bad, relatively speaking.) A lot of the time, though, you'd be learning to program from someone--a friend, or a teacher--and they'd give you a compiler on a floppy/CD. Programming books also came with compiler toolchains on disks in the jacket sleeve quite a bit of the time, if I recall (usually DJGPP.) I'm not quite sure how you'd be learning to program but stuck for the compiler part.

But I started really trying to learn to program in 2001, not 1985, and I had Internet access. By 2001, it didn't matter what Microsoft was or wasn't doing. I could download MinGW and code some C within an hour or two. You know what the hardest part of that process was? Learning C. The "barrier" of finding a C compiler on the Internet, downloading it, installing it, etc. was immaterial, even then. In fact, the biggest "barrier" was figuring out that the thing I should be looking for is called a "C compiler"!

But I didn't necessarily need a C compiler at the time, really. If I had wanted to code VB, WSH was already on the machine. If I had wanted to code Javascript, IE would have let me. And if my copy of Windows came with Office, there was always VBA, too.[1]

Today, of course, it's even easier. Ignoring the actively-didactic software like Shoes or Scratch or Squeak, you can just download Light Table and twenty seconds later be coding Clojure. Or Love2d and be making a game in Lua. Or you can code "in the cloud" in your browser without downloading anything, if you like. Windows 8 ships with PowerShell installable from the "Programs and Features" menu, I believe; and there are several compiler/IDE/whatevers in the Windows App Store.

In short, you can be programming as soon as you want to be. The hardest part of programming on any OS, for the last 15 years at least, has just been wanting to--and then putting up with all that rigorous, logical thinking.

> a concern about space consumption about runtimes

I didn't say anything about space consumption.

- Every package an OS includes in the base install is a package they have to security-audit, update, and support.

- Every base package is something a user could call in with a problem regarding, and so something customer service reps have to understand.

- Every base package is part of the OS's attack surface.

- Every base package is something developers will presume to rely on to be part of the OS in the future, and so the OS maker will have to commit to supporting indefinitely for as long as they support the OS.[2]

- And every base package is something that other internal developers might build other components of the OS to rely on.

In short, there are good reasons to ship as little with your OS as you can, reasons that have nothing to do with the size of the resulting installation.[3]

---

[1] I once programmed a JRPG-style game as a PowerPoint presentation using slides for views and VBA for controllers. Not as bad as you'd think; worked a lot like Flash + ActionScript, with the same "timeline" concept, but without an actor model.

[2] If Microsoft shipped Windows with a VB4 IDE, there would be programs, even today, wanting to go in and diddle with that IDE over COM to "programmatically" generate another program. Microsoft still supports DDE in the OS in part because there were so many programs written that used it to diddle with Excel workbooks--and Excel is an optional package.

[3] Despite that, I should note that installing the latest Visual Studio takes something more like 10GB than 174MB. You don't install Visual Studio on a computer you're not going to use for Visual Studio, even if the DVD is lying right there. It's not a "freebie" to give someone when setting up their computer, like Google Earth or VLC. It's more like VMware: it takes over your computer and puts hooks everywhere, because it needs to do things like trap segfaults and handle them itself, or test-install services nobody's codesigned yet, etc.


I tried to learn Java when I was a kid and spent almost 3 hours trying to get Hello World to compile. I tried again for an hour the next day and couldn't do it. I also didn't understand what words like "void", "static" and "main" meant. It was all voodoo mojo I didn't have in me. I thought I must not have been talented enough to understand these words.

You need to compile a program to get any feedback at all.

I did learn some HTML and JavaScript, because it was so easy to get started. I made table-based layouts and all that. But JavaScript wasn't a serious programming language at the time - I really wanted to make a Java applet. The toolchain was just too complex and intimidating for me, so I gave up. Years later, I picked up the old Java book again and realized it was no good anyway.

I could have learned C from K&R when I was a kid, especially if I used Linux. I was always interested in Linux, but my family was paranoid about "breaking" the computer. I would have also been plugged in to open source culture at a much younger age. With Windows, it was all about freeware.



Kind of funny that Bill Gates would have reversed his position on patents. (But maybe Gigaom is misquoting there; maybe he is referring to "intellectual property" in general.)



