Lost Lessons from 8-Bit BASIC (dadgum.com)
171 points by ingve on Sept 2, 2014 | hide | past | favorite | 80 comments


There is one more important thing here: the 8 bit computer immediately invited you to start coding. The barrier to entry was incredibly low to the point of nonexistent. Compare this to a laptop today or even worse a tablet.

I very strongly believe that perpetuating a "code me!" mindset vs the "consume me!" mindset has really big consequences. The ability to make your computer do something that you can do something with, that the kind of game you bought with your computer is something you could attain as well (which is not necessarily true -- some of those games were incredible feats of programming but still the illusion was there) is completely, absolutely missing today.

And the consequence is: you get (at least the illusion of) the possibility that you yourself can create something that is easily spread worldwide. HN readers will argue that the huge audience of tablets makes up for this, but the problem is that most people will never even think they can do it. That's the problem: did your iPad come with a manual for a programming language? I didn't think so.

Program or be programmed. http://www.rushkoff.com/program-or-be-programmed/

Kids Can't Use Computers http://coding2learn.org/blog/2013/07/29/kids-cant-use-comput... (I know there is a lot of controversy about this article but it does have valid points.)


That's a false equivalence (at least I think that's the right term). Back then, a huge barrier to entry made computers the exclusive domain of the kind of people who would grow up to read Hacker News. Blindly typing in games from magazines without trying to understand them is the 1980s equivalent of "watching shit scroll by for hours makes me a Linux expert overnight". If that gives you an illusion of being able to create programs, it's the same kind of illusion that causes people to post oDesk tasks to build a Facebook clone with Outlook integration for $200.

Unleashing the power of computing to the general public is an incredible leap forward and arrogantly brushing away the riff raff and their parochial consumption is like dismissing the industrial revolution because who needs cheap cloth when you can get fine lambswool tweed?

Let's not pretend that any one of us highly skilled programmers could actually build even a small fraction of the tools we turn on and "consume" (even if for the purpose of writing new tools) on a daily basis - if not for lack of skill, although a hugely diverse range of skills went into them, then for sheer lack of time.

Yes, we need to demystify programming and teach at least rudimentary skills in school. Taking a few years of a foreign language doesn't make you proficient in the language, and you might never need it, but understanding the abstraction behind the existence of different languages is important and useful in its own right - in the same way, understanding in even the most simplistic way what makes a computer tick is important for understanding much of the world today.


I know this isn't really your main point, but I think you're probably selling short the activity of typing in code from a magazine. Firstly, the act of typing in and of itself is an advantage in the task of learning to use computers. Secondly, no one ever typed a program in perfectly; you would have to debug the program when you made a typo. Sometimes the typo would create a syntax error and your interpreter would point you directly to the offending line. Other times the typo would result in still-valid syntax, but your program would crash, and the interpreter might point you in the right direction, or might not. Or your typo might create a bug that just made the program function incorrectly. Hands up, everyone who spent a day searching for the line that had a less-than rather than a greater-than sign typed in! The act of debugging these very real problems (without even talking about errors in the as-printed code!) is a very real introduction to programming.

Lastly, and most importantly, if you were into computers, you inevitably encountered code that was designed for the BASIC dialect of another computer. To make it run on yours, you had to read the code, understand what it did, and then modify it to use commands (typically graphics commands) from your computer's own dialect to make it work. At this point you're no longer even looking like you're just typing in code - it's full-on programming. At one point in my life, I carried around the information necessary to translate BASIC programs between TRS-80, Apple ][, Commodore 64 and TI-99/4A BASIC dialects.

We have lost this heritage. You don't need to type in code, you just download it from the 'Net. That means no slow reading of the code, no debugging typos, and no attempts to transfer code from one environment to another (as most development environments these days have got pretty good at virtualising the underlying system such that it presents a consistent interface to the programmer - my Javascript webpage generally works equally well in Safari, Chrome or IE).


It's not worthless, and neither is "watching shit scroll by for hours", but I think the better signal is in the motivation and tenacity of the kind of people who set out to do this in the first place. In the '80s, this kind of person made up a very large proportion of all computer users out of sheer necessity (heck, in the '70s, you generally had to assemble your own hardware as well). Today, it's a vanishingly small fraction - but in absolute numbers, it's much, much bigger, and a much, much easier group to enter.


People do still assemble their own hardware and write their own dedicated programs for that hardware. The blossoming microcontroller movement is a lot like PCs were back in the day.


Of course they do. And people write web servers in assembly. But they don't need to do it to use a computer at all.


Also see "Why Johnny can't code" by David Brin: http://www.salon.com/2006/09/14/basic_2/


I have a lot of respect for Brin, but it's amazing how wrong that piece is. There's more code being written now than ever before, not just by kids but by all kinds of everyone.

There's an entire sub-culture of collaborative development - github, etc - which would have been impossible with 8-bit micro technology.

Away from that, a lot of non-devs have hacked together simple VBA/AppleScripts. And not a few people with zero experience have produced apps of reasonable quality.

Never mind web pages, including Wordpress sites.

Of course you can't own your hardware any more. But no one ever did. Even if you built a Z80 micro from scratch and burned your own EPROMs for it, you were still stuck with the original instruction set, and you likely had no idea how it worked on the inside. You certainly couldn't hack it to change it.

As someone who made a brief living building custom Z80 controllers and coding in assembler, I'm thrilled by modern dev tools, and the fact that if you have a problem you can almost always find the answer online.

Compare that to my first C programming experience on the Atari ST, where I spent two weeks trying to work out why a library function in the compiler didn't work, only to get a letter - on paper - from the compiler developers saying 'Er, we haven't done that part yet.'


Nobody's arguing that it isn't easier today for the professional, but that doesn't really lower the barrier to entry for the barely interested. Never mind that we simply have orders of magnitude more people with computers now, so that even with a drastically lower percentage of people actually starting some kind of dev environment, we still come out on top.

Yes, it's easier to create the good stuff (well, at least web stuff and mobile apps…), but I'd doubt that it's as easy to create silly crap that you're inordinately proud of. That may be nothing more than a simple question/answer loop, but even with that you created something and you understand some basics of how a program operates…

"Why Johnny can't read" was about literacy, not everyone going out to be a journalist or Hemingway. I view Brin's piece in the same light.

But yeah, we really shouldn't get too lost in nostalgia. Things aren't that bad, and projects like the Raspberry Pi show that this subject is being tackled today. For me, the lesson to learn from BASIC environments isn't about forcing people to relive that nowadays, but that the programs that resulted from the very first days with such a computer are interesting for educational purposes.


More than the Raspberry Pi, I am optimistic about projects like JSFiddle, where kids can make cool things and show them off to their friends.


Your statement that more code is being written now is mostly due to the fact that there are many more machines. The population of people that have access to computers is much larger now.

If you go back to the 80s, I would wager that the percentage of people that coded (or even dabbled in code) was higher than it is now.


Even the IBM PC and XT had BASIC in ROM, which worked much the same way; and it also came with a complete set of schematics and the source code of the BIOS (I don't think the source for ROM BASIC was included, since it was licensed from Microsoft.) While it wasn't truly "open source" in the legal sense and was still copyrighted, it gave users a chance to learn how their machines worked all the way down to the hardware level, and along with that came the feeling that they actually owned the hardware they bought. DOS also came with a version of BASIC and DEBUG, a simple debugger that allowed the creation, editing, and testing of Asm programs. I remember the magazines at the time would have source code listings of simple programs for readers to type in and use. This isn't really "programming" in the sense of "write your own code", but it definitely provided a "catch-point" - some of the more curious might modify the program and see its effects, or do more research about e.g. the instructions and BIOS calls it used, and eventually start writing their own.

Several decades later, you're lucky if you can find even a detailed datasheet or programming manual for the most important chips in a computer. BIOSes are all closed and proprietary, with the exception of minority projects like Coreboot. The relatively few schematics for commercial PCs only exist because someone was nice and neighbourly enough to leak them. I think the gradual shift towards consumer-oriented design is part of it, but security also had a chilling effect: belief in "security through obscurity" and the idea that users shouldn't be developers has led to a situation in which access to development tools and information is seemingly treated as a privilege instead of a right, and systems are correspondingly locked down against users (but they'll all say this was to prevent "malicious attackers"...)

The walled gardens of Apple's iDevices, Microsoft's position on Secure Boot/Trusted Computing, and the increasing prevalence of other schemes like DRM designed to take control away from users and strongly push a consumer-oriented mentality are great evidence of this effect. More subtly, dumbed-down software designed to be "easy to use" takes away much of the incentive to learn about how things work that is often responsible for transforming consumers into producers. No doubt the companies like this because they want to be in control and regulate the creation of software; as others and I have said before, "knowledge is power, and they don't want the users to have too much of it." However, I don't think they're ultimately going to benefit from this practice, since by encouraging users in the direction of consumption, they'll be reducing the number of potential good developers in the future.

From that article you linked to:

A kid puts her hand up in my lesson. ‘My computer won’t switch on,’ she says, with the air of desperation that implies she’s tried every conceivable way of making the thing work. I reach forward and switch on the monitor, and the screen flickers to life, displaying the Windows login screen. She can’t use a computer.

Having done some work helping with teaching before - in a computer science course - the number of times I've seen this happen is astounding. A large number of the population seem to have this condition where it appears their brain completely shuts down the moment they're put in front of a monitor, and I think a large part of it has to do with the notion that computers are somehow "magical" and "mysterious" things that don't follow the same rules of the universe as everything else.


... I think a large part of it has to do with the notion that computers are somehow "magical" and "mysterious" things that don't follow the same rules of the universe as everything else.

Excellent point. Computers are indeed like bubbles of custom universes that don't follow the rules of this one. The web doesn't have a real-world metaphor.

How do those universes and their foundational laws get created? By programming, of course. We all know that the stuff computers do is just built up from simple stuff that's been meticulously assembled into layers of increasing complexity: electrons on tiny wires, ands and registers, ifs and structs, packets and sockets, requests and threads, etc. But that knowledge is not discoverable by using a computer today.

It's no wonder that children don't know how to deal with failures that occur in the real-world interface of these custom bubble universes, when all they're taught about that interface amounts to magic gestures that trigger actions within the bubble. When a magic wand stops working, how do you fix it?


Regarding machines and discoverability: I think an interesting comparison to flesh out would be computers vs. cars. Compare how tinkerers in each came to be (opening up their parents' car/computer, for example). Compare how "the average user" treats the thing when it breaks. Etc.


This made me wonder when PEEK and POKE were no longer available. And I ran across PC magazine vol 1, issue 1: http://books.google.com/books?id=w_OhaFDePS4C&printsec=front...

Lots of good stuff in here. Page 52 has some details on the new "escape" key.


> the idea that users shouldn't be developers has led to a situation in which access to development tools and information is seemingly treated as a privilege instead of a right

Unfortunately, a lot of people have been waging the War On General Purpose Computing[1] for a while now.

While many people saw Doctorow's first talk[1] on the War On General Purpose computing, his sequel talk[2] about the upcoming Civil-War On General Purpose Computing is one of the most important talks I've ever seen.

The talk is about exactly this point: Turing completeness scares various traditional powers (and some new/upcoming powers), and their instinct is to lash out and try to mandate that someone make a computer that can't run the various programs that scare them. Not understanding what "Turing Complete" means, they end up in a cycle that tries to blacklist anything that allows these scary programs to be written and used. Hiding the dev tools is just one step on a path that - by definition - must end up with the hobbling of computing devices until they are no longer Turing complete.

This problem needs to be fought right now, in a big way. The non-technical population is just starting to understand the full impact of what it means to have a General Purpose Computer, and we are only seeing the first skirmishes. The 2nd talk I mentioned above by Doctorow[2] - which I very strongly recommend that everybody watch - points out how modern concerns over "dev tools" (or "copyright") are trivial. We need to start the fight now, because it will be a lot more important when the War moves into areas such as medical devices or self-driving cars. We are already seeing the War start to affect areas like property rights and "first sale"; just wait until someone gets sued for publishing a fixed firmware for some company's buggy medical device.

Teaching the next generation about computers from an early age is an important part of the fight to preserve General Purpose Computers, and promoting "learning by exploring" so kids of the future can learn like we did on older devices would help a lot, at least in the long run.

[1] https://www.youtube.com/watch?v=HUEvRyemKSg

[2] https://www.youtube.com/watch?v=nypRYpVKc5Y


When I get into a funk thinking about the above points, I cheer myself up thinking that the people who used the 8-bits were probably a somewhat self-selected group, and that all today's programming-unfriendly UX did is extend computer use to the rest. But the original cohort will program today just as they did way back when.

And furthermore, today we have the Internet and its mind boggling amount of information. Did I like learning on my Atari 800 XL that came with a programming manual and booted up in 1s? Absolutely. Would I have killed to have access to all the hardware register information, instruction set manuals etc. that I couldn't get at the time without--in my country--being politically connected? You bet ;-)


Yes, it is indeed a sad state of affairs.

Instead of Dynabooks, we got interactive TVs.


Absolutely. It invited you to the point that for many of the 8-bit computers, to load a game you'd typically execute one or more BASIC commands that were no different in principle from any other.

On the C64 you might do "just" LOAD and RUN for a game on tape. Or you might do LOAD "$",8 (load the special directory 'file' from device 8 - the first floppy drive); LIST (to see the contents of the floppy), and then LOAD "somefile",8 / RUN to load and run your chosen program.

You were not just invited to play with BASIC. You were forced to at least acknowledge it and learn a few commands to do anything. So even if the command prompt did not intrigue you, perhaps the few BASIC commands you needed to load your games did.

And if you did a LIST on those games or programs you loaded, you might be surprised by a full or partial BASIC listing (a noteworthy example: Sid Meier's "Pirates!" for the C64 was a mix of BASIC and machine code, though still a nightmare to modify for the curious, as the machine code extended down into the BASIC memory area, which meant that changing the tokenized length of any line would cause part of the machine code to get moved - or overwritten - and everything would break), or you might see the mysterious-to-a-beginner "SYS some-address" indicating a jump to machine code. My first forays into assembly programming came after spending lots of time seeking information to figure out that curious thing.

Couple that with magazines with type-in program listings in between game reviews and other stuff, and with manuals that pretty much started with the assumption you wanted to learn to program. I don't think people who grew up after this era really understand just how impossible it was for computer users to avoid having source code regularly thrust in their faces, even for those not seeking it out.

For example, the VIC-20 manual that can be found at [1] (warning: large PDF), or the C64 manual at [2]. The VIC-20 manual starts with:

"You are about to meet a friendly computer! Friendly in price, friendly in size, friendly to use and learn on and experience. Most important - you don't have to be a computer programmer, or even a typist, to use it!"

.. and goes on to explain how easy it is to learn programming, for anyone, and which chapter to turn to to learn various aspects. Only halfway through the preface does it start to address users who don't want to program...

[1] http://www.classiccmp.org/cini/pdf/Commodore/VIC-20%20User's... [2] http://www.commodore.ca/commodore-manuals/commodore-64-users...


You can open a web browser on a computer these days and plug away at HTML and Javascript and do some really neat things. It's the BASIC of this generation.


But it's not in your face the same way. Most people I know likely have no idea that they can view the source of a web page, or what the "source" of a web page might mean. There are far fewer "bread crumbs" leading people towards activities that expose them to the plumbing underneath the services they consume.


Earlier the barrier was the high price of a computer.

Today the barrier is that you have to specialize more.

But on the other side you have more tools, more resources (the internet), more help, more tutorials, ...


That is right, but you have to read at least 5-10 tutorials before you can start to code ...

Or you are lucky and start at the right place (e.g. you download Python and start with a Python tutorial) ... but if you started with Windows, you might be out of luck if you did not find some cheap way to access one of the powerful IDEs in existence (many are still very expensive and/or require that you read the same number of tutorials).


And for kids it is much easier to go off and watch YouTube after getting bored while looking for tutorials. I guess the charm and mystique of machines are also lost on the children of today.


I've still got all my machines from that era. They still work, they're still highly entertaining, and very, very useful. My kids (6 and 4 yrs) are learning to read, write, spell .. and do math .. with the same machines I used when I was 13.

This just points out, to me, how arbitrary technology really is. All the energy that went into building that C64 is wasted if the thing ends up on the trash heap .. but dust it off today and someone, somewhere, will still find a use for it.

    "Where did the IDE go wrong?"
I think where things went wrong is the disassociation of 'developer' from 'user' that happened as a consequence of marketing-grads getting involved in the business of computers. I've never considered an OS truly 'user friendly' if it doesn't ship with everything on board that a person would need to build applications for it - and that is something the BASIC guys did well, back in the day.

(Which is why I think that things like LOAD81 are so darn cool .. ;) http://github.com/antirez/load81)


> I think where things went wrong is the disassociation of 'developer' from 'user' that happened as a consequence of marketing-grads getting involved in the business of computers

Somehow this happened at the same time that computer users rose from 1% of the populace to the vast majority. We probably ended up with more "developer" guys this way.


It's difficult to really say, but you may be 'right' in the sense that the more people you throw at something, the greater the variety of skill sets you have to accommodate. In the early days, "only nerds used computers" - and now look at us. Still, I blame the tools vendors for not making it viable to include developer-style applications as a basic default built-in - certainly not true of most OSes, except 'those built for consumers'. Imagine if we'd had the temerity, we nerds, to demand that the dev tools be treated as 1st-class applications in the OS/execution environment? I suppose we'd all be using Lisp machines, eh? :P


To me a huge loss from that time is that you could still completely understand what your computer was doing and what the software running on it was. Bloat has solidly killed that possibility; you could not even understand all the code on your phone these days if you wanted to, let alone your desktop machine.


"you could still completely understand what your computer was doing and what the software running on it was"

I can see a potential return to those days for people who want to be really sure that their systems are secure.


I recommend you keep a close eye on the work of the VPRI:

http://www.vpri.org/pdf/tr2011004_steps11.pdf


Nowadays a phone and a desktop computer are at about the same level of complexity in terms of code.


Sorry, got any sources to back up that claim? Smartphones are indeed complex but I highly doubt Windows phone has the same LOC as Windows 8 (despite them sharing the same core), same for iOS -> OSX


The download is certainly bigger for OS X Mavericks (5.3GB) vs iOS 7 (750MB). OS X seems a lot more open in terms of understanding what's going on, though.


As a ZX Spectrum veteran, the only things I really missed - in the sense that they made my bigger programs unwieldy - were precisely those cited in the first paragraph, viz. calling subroutines by name rather than line number, and parameter passing. Oh yes, and an ELSE statement. Of course those snooty BBC Micro kids had all that, IIRC.

The Spectrum community was fantastic in those days - in addition to a plethora of magazines there was the "ZX Spectrum ROM disassembly" (which I still possess) that gave an annotated listing of the whole 16K ROM: BASIC interpreter, floating-point calculator, cassette tape routines, the lot; an absolute goldmine.

So an entire ecosystem that basically screamed "program me!". A beautiful time.


I know it's not exactly the same, but open the Javascript developer console in a browser, and you have somewhat similar capabilities at your fingertips. You can alter your whole environment, if you consider the particular webpage you're visiting to be your whole environment. You don't need to do anything much more special for "cos" there either; just type Math.cos(2) / 2.

And if you're using Linux, it's easy to be welcomed by the bash prompt, of course (and Macs ship with a Unix terminal these days too).


You can easily have this on a Linux machine:

First, install BASIC: apt-get install bwbasic

Next, find where getty starts the login program and change it to run BASIC instead. In Ubuntu: /etc/init/ttyS0.conf:

   start on stopped rc RUNLEVEL=[2345]
   stop on runlevel [!2345]
   respawn
   exec /sbin/getty -8 -n -l /usr/bin/bwbasic -L 115200 ttyS0 vt102
You will see this on the serial port:

   Bywater BASIC Interpreter/Shell, version 2.20 patch level 2
   Copyright (c) 1993, Ted A. Campbell
   Copyright (c) 1995-1997, Jon B. Volkoff
 

   ERROR: Failed to open file --
   bwBASIC: 
   bwBASIC: print "Hello, world!"
   Hello, world!
   bwBASIC: 10 for a = 1 to 10
   bwBASIC: 20 print "Hello ", a
   bwBASIC: 30 next a
   bwBASIC: run
   Hello         1
   Hello         2
   Hello         3
   Hello         4
   Hello         5
   Hello         6
   Hello         7
   Hello         8
   Hello         9
   Hello         10
   bwBASIC: 
bwBASIC is a shell.. so you can type "ls".. or "exec emacs"..


That's pretty similar to what the OP covered, except part of the attraction was NOT having to install or configure special tools. Remember, having BASIC always on was what made it so accessible.

I get it: after you do this you can just log in and, voila, BASIC. Still, until Linux comes with this as a special user login, there's a gap in the experience.


Agreed. There was something universal, almost magical, about ROM BASIC.

For me it was Commodore BASIC on the C64. I turn the machine on, and instantly there's a little flashing prompt inviting me to give it some input.

Having to enter commands here to LOAD from tape was what initially sparked my interest in programming. I'm telling the machine what to do, and it's doing it right away. Wow.

There was no barrier to entry. And despite valiant efforts these days, it was so much better than the modern "making programming accessible" ideas. I didn't have to install anything. Every C64 I would encounter had this functionality. It didn't feel like a dumbed down second-class-citizen environment. This was how you interacted with the machine.

I think that the mainstreaming of general purpose computers has killed this. I can't imagine booting my primary machine to a BASIC prompt now (or even a shell, I guess). It would make listening to music, watching videos, checking my bank balance or accessing my virtual machines for work impossible.

I don't have separate media devices. I can't just throw on my headphones and listen to my walkman while my computer is busy with something.

Although modern computing has unlocked a lot of potential, I think we've made too many compromises. Outside of industrial and scientific applications, most computing devices try to be a jack of all trades, being everything to everyone, and doing it poorly. Like so many contemporary software engineers (myself included) - I mean, the modern tools and frameworks are too large and diverse to know as intimately as we used to know computers.

Sigh.


We have to cut ourselves a bit of slack here. Sure, 8-bit computers came up in a BASIC prompt... but they could default to a BASIC prompt because they had nothing else to do. Modern computers do. We need to at least allow for having to tell the computer to bring up this environment rather than something else... after all, if there are so much as two wonderful learning environments, we need to be able to pick between them.

If you insist, you can build a kid-friendly distro of Linux, and then build custom computers to put it on which come up straight into this environment... but given that that's been tried before, I'm not terribly optimistic about that shutting down the continuous stream of complaints of this nature.

And I guess that I know this is a popular opinion amongst a certain segment of the programming population, but the scientist in me can't help but note the large number of times this has been "fixed", yet, literally years and decades after the fixes, the complaints are still flowing, virtually unchanged. Personally I think the most likely explanation is simply that this is an incorrect diagnosis of the problem. Tempting, easy, seductive, but incorrect.

Further evidence for the incorrectness: Despite the way that many programmers got their start in this environment, I'd point out that far, far more people were exposed to this environment and still did not become coders, or worse, decided that computers weren't for them. I don't see any particularly concrete evidence that leads me to believe this was that special of an environment, once I discard nostalgia. At the very least we'd need to show some sort of difference between two environments without the confounding factor of a dozen orders of magnitude difference in performance, to say nothing of the other advancements made since then.


You don't even have to log in: it just comes up when you boot the system. I think if you install the server version of Ubuntu, no GUI automatically starts, so you could run this on tty1 (the first virtual console).

Anyway, I understand the OP's point. You are right that a new computer does not come up this way right out of the box. I could see setting up a bunch of Linux machines like this, to recreate something like the C64 school computer labs of the past.
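For the curious, the tty1 variant would look something like this - a sketch only, carrying the runlevel and getty conventions over from the ttyS0 stanza earlier in the thread; the exact flags are assumptions, not tested config:

```shell
# /etc/init/tty1.conf -- hypothetical adaptation of the ttyS0 stanza
# for the first virtual console (upstart-era Ubuntu)
start on stopped rc RUNLEVEL=[2345]
stop on runlevel [!2345]
respawn
# -l swaps the login program for the BASIC interpreter;
# no serial line-speed option (-L) is needed on a virtual console
exec /sbin/getty -8 -n -l /usr/bin/bwbasic 38400 tty1
```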


Maybe this is an opening for someone to build some Linux distros in the flavour of old micro computers...


Hmm... the JavaScript console in modern browsers like Chrome and Firefox is probably the closest thing we have to the old 8-bit BASICs. It's always on, and available at the click of a mouse.

It's not as directly "in your face" as the BASIC was, of course, but it's there.


I bought a mint C64 off of eBay last year; included was its original Programmer's Reference. It immediately had me missing the afternoons when the limited functionality and simplicity of it all invited you to peek and poke your way around the system, trying to coax music and art out of the hardware. Even people who weren't 'savvy' understood this, ex: in the 6th grade our librarian gave us simple programs to type in and encouraged us to change it.


http://cl.ly/image/3Z2I3r051J0O <- Simple Memory Map of the C64, those were the days!


> ex: in the 6th grade our librarian gave us simple programs to type in and encouraged us to change it.

Of course one great thing is that C64 was pretty much immutable and thus 6th-grader-proof. No matter what sort of state you got it in, a reboot and all was well again.


I think the article misses the point a little bit. The IDE facets that the author fondly remembers aren't part of the BASIC programming language; they're part of the command shell. It's loosely akin to your terminal emulator running Bash. By default it works in real time, but you can write more complicated routines programmatically and then run them at your convenience; and you can do so from the shell prompt (via aliases, shell functions, or just echo'ing to a shell script in a similar fashion to his line-numbered example).
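To illustrate that duality (a minimal sketch; the `greet` function name is invented): the same line runs in Bash's immediate mode, or can be collected into a named routine and re-run later, roughly like numbering lines and typing RUN.

```shell
# Immediate mode: executes as soon as you press Enter
echo "HELLO WORLD"

# Deferred mode: collect lines into a routine, then "RUN" it by name
greet() {
    echo "HELLO WORLD"
    echo "IT IS GREAT TO BE HERE"
}
greet
```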

Plus a lot of his complaints seem to be about the modularisation of modern languages, which seems an odd complaint to make in my opinion. If anything, I'd personally argue that importable, self-contained chunks of code are one of the single greatest advances.

He definitely has a point that the barrier to entry these days is much higher (and this is probably why so many kids these days fall into web development over native applications), but I think the examples he's used don't justify the conclusion he's trying to draw. And neither do I agree that regressing to a BASIC-like environment would fix the problem. I think the problem is simply expectation - people expect so much more that there often isn't the patience to start with the basics. Plus the "code me!" vs the "consume me!" mindset raised by chx[1] erodes what little patience some might have.

That's my 2c worth anyway

[1] https://news.ycombinator.com/item?id=8256211


BASIC and its command shell were in an 8KB ROM. They were one monolithic program. You've got modern ideas and are trying to apply them to a 20-year-old environment. It's not 'like' a lot of things; it came before them, so the most you could say is that those things are like BASIC.

What BASIC was, was an extremely accessible try-it-now environment that needed no installation, no setup, no environment variables, no directory structure. It was what we thought of when we thought of interpreters, as opposed to compilers. But then interpreters got all file-oriented and broken too.

So there have been a lot of improvements since then. But some of what we lost was very, very different. And some of it was valuable in a way.


> BASIC and its command shell were in an 8KB ROM. They were one monolithic program. You've got modern ideas and are trying to apply them to a 20-year-old environment. It's not 'like' a lot of things; it came before them, so the most you could say is that those things are like BASIC.

Except what you've described is exactly what I described with Bash ;)

> What BASIC was, was an extremely accessible try-it-now environment that needed no installation, no setup, no environment variables, no directory structure. It was what we thought of when we thought of interpreters, as opposed to compilers. But then interpreters got all file-oriented and broken too.

Again, all that is comparable with Bash:

1. Bash doesn't need any installation on most UNIX-like systems. But even in the rare case that Bash isn't bundled with your OS, there's the Bourne Shell (sh) or even Windows' cmd.exe (for all its faults).

2. Bash et al don't require any setup.

3. Environment variables are just global OS variables, and since BASIC wasn't namespaced, all variables were effectively environment variables.

4. Directories are a file system thing, not a language thing. However, it's worth noting that many BASIC systems did have a directory structure, even though (at that point) nested subdirectories didn't really exist. You could have different storage devices that were switchable (akin to changing drive letter in DOS) and you could store multiple files on a particular medium. In fact I still have a stack of floppy disks for an Amstrad CPC 464 (which ran Locomotive BASIC) in my attic that would testify to this.

5. Bash et al are interpreters.

Don't get me wrong, I reminisce about the old days too. They were fun. But I don't think you can blanket say "[it's a] 20-year-old environment. It's not 'like' a lot of things" just to dismiss any arguments you dislike.


I only disliked the wrong arguments :) Like

"The IDE facets that the author fondly remembers isn't part of the BASIC programming language, it's part of the command shell"

There was no such distinction. That's a recasting of the facts into something understandable today.

Know why the language keywords were so short? Because in an 8K ROM the symbol table was a significant hit on resources. Add a keyword? Means remove some other feature. It was a whole different world, with different constraints. And still it was usable, accessible, friendly even. At least when coming from nothing to a computer (instead of coming from 20 years of growth and confusing hindsight with foresight).


> There was no such distinction. That's a recasting of the facts into something understandable today.

But that's the whole bloody point of the article. If you have an issue with that then take it up with the author rather than me.

Please also remember that the Bourne Shell is as old as BASIC micro computers. So my comparisons are of two environments of the same age rather than older systems vs modern systems (like you keep accusing me of).

And around the same time some micro computers (even the lower-end ones) would support other interpreters (eg the BBC Micro supported BBC BASIC, LISP, LOGO, Fortran, and a few others), and you could switch between languages like you switch shells on Linux. Which is also why I like to make the distinction between the language and the shell.

> Know why the language keywords were so short? Because in an 8K ROM the symbol table was a significant hit on resources. Add a keyword? Means remove some other feature. It was a whole different world, with different constraints. And still it was usable, accessible, friendly even.

I know - I was there. And while it's interesting, it's also irrelevant to any of my or the author's points.


[deleted]


Good point! Minicomputers were around before micros. So BASIC wasn't born in a vacuum. Lots of precedent.

What's relevant is, modern ideas of encapsulation, object-oriented design or even structured design were not available to the inventors of early language ROMs. Nor did they have the room for such luxuries. They wrote one blob of code, probably in one large .asm file, that got the job done.


They were available, as Fortran, LISP, C and Pascal were all around back then. But you're right about the necessity of coding the ROMs in assembly.


> Plus a lot of his complaints seem to be about the modularisation of modern languages - which seem an odd complaint to make in my opinion. If anything, I'd personally argue that things like importable, self-contained, chunks of code is one of the single greatest advances.

Yeah, his respect for BASIC seems misplaced at best. Take this excerpt:

> There's a small detail that I skipped over: entering a multi-line program on a computer in a department store. Without starting an external editor. Without creating a file to be later loaded into the BASIC interpreter.

So what? Any language with a REPL (and nowadays, that's pretty much all of them) can do that. And sure, computers don't boot up into a REPL, but a) do you really want them to? and b) as the top comment here shows, you can easily set up a computer to boot straight to a REPL.


On those early consumer 8-bits, the command shell WAS the BASIC interpreter. You turned the computer on and you were right there in the BASIC environment! You couldn't do anything without typing in BASIC commands.

Your other points are OK, but it really wasn't like a Bash shell. An interactive Python command-line is closer, but it's still a stretch.


A Bash shell is a command line interpreter. Bash is Turing complete and even supports more advanced programming paradigms than those early BASIC machines did (eg functions, different scoping, more advanced debugging and error handling, forking, etc). This is all just built-in commands - I'm not including any other POSIX userland. In the right hands, Bash is every bit as much a REPL environment as an interactive Python or LISP shell.
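For instance, here is a quick sketch of one of those features - function-local scoping, which the line-numbered BASICs had no equivalent of (the `counter` variable and `bump` function are invented for illustration):

```shell
counter=0

bump() {
    # "local" shadows the global only inside this function; the old
    # 8-bit BASICs had nothing but globals
    local counter=99
    echo "inside: $counter"
}

bump                        # prints "inside: 99"
echo "outside: $counter"    # prints "outside: 0" - the global is untouched
```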

I don't understand why you're discounting Python here though. Would you mind explaining to me how a REPL BASIC shell differs from a REPL Python shell, aside from the 20-year age gap between them?

One last thing, as I noted in an earlier comment, most of those BASIC micro computers did also support other languages and shells. The later models of the BBC Micro came with LOGO in addition to BBC BASIC and also supported Fortran, LISP and a few other languages. My Amstrad CPC 464 runs Locomotive BASIC but I also have a CP/M disk for it. And around the same time (and even a few years earlier) there were LISP machines and other computers which booted into other language shells.

BASIC wasn't unique nor special in the regard that you're praising it for, neither back then nor now. What made BASIC micro computers special was how simple the language BASIC was (otherwise we'd all be looking back fondly on our Fortran or LISP machines!). And this is why I make the distinction between the shell and the language. Because there's always been and always will be a large array of similar REPL shells - the key distinction between them being the accessibility of the language.


I considered saying the same thing about Bash, but it's really missing a big portion of what made BASIC exciting on those machines: the ability to draw graphics. Some of the other scripting languages with Tk linking might count though.

Frankly, I'm convinced JavaScript is the modern equivalent of BASIC. It even comes complete with some horrible brain-rotting syntax. But it does provide a low barrier to entry (everyone has a web browser), the ability to show your results to others, and the fact that it remains in source form allows beginners to examine how something was achieved.


It depended upon the BASIC version, but the one you got on an IBM PC (the one embedded in the BIOS, usually only on PCs from IBM) was an odd cross between an editor and a REPL. Start out:

    PRINT "HELLO WORLD"
Press ENTER, and you get "HELLO WORLD" on the next line. Okay. Move the cursor up to the PRINT, hit the Insert key, type "10" then ENTER. You have now just entered a line into your program. Type LIST and you see

    10 PRINT "HELLO WORLD"
Move the cursor back up to line 10. Change the "10" to "20" and change "HELLO WORLD" to "IT IS GREAT TO BE HERE" and press ENTER. Reposition the cursor over the LIST command and press ENTER:

    10 PRINT "HELLO WORLD"
    20 PRINT "IT IS GREAT TO BE HERE"
Moving the cursor to the top of the screen wouldn't cause the display to scroll up, but that was more due to constrained resources than anything else. Other than that, the entire screen was the editor. Type in an expression to test it out. Go back up and fix it. Once it's fine, slap a number in front to add it to the program. As a REPL, it was vastly different from any REPL I've seen since.


You can do that with Bash (albeit not with line numbers).

But as I said before, there were other languages available for said machines, and those had their own editors as well. What you're describing isn't an attribute unique to BASIC.


I remember using my ZX Spectrum to study multiplication tables with a program I made in BASIC that showed me the tables and then asked me random multiplications from a table. Fun times...
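Something in that spirit can be sketched in a modern shell (an approximation of the idea, not the original Spectrum BASIC; the `quiz` function and its non-interactive answer line are my invention):

```shell
# Drill one times table: ask a random multiplication, then show the answer
quiz() {
    local table=$1
    local n=$(( (RANDOM % 10) + 1 ))   # random factor from 1 to 10
    echo "What is ${table} x ${n}?"
    echo "${table} x ${n} = $(( table * n ))"
}

quiz 7
```

An interactive version would `read` the pupil's answer instead of printing it; this sketch just keeps the question/answer pair together.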


This has something of the Smalltalk image-based approach (and the author might enjoy using a Smalltalk development environment). IMO the costs outweigh the gains - it's worth decoupling programming a computer from using it. When source code is just text files, you can manipulate it with lots of powerful tools; even better, you can use tools from different languages whose authors never talked to each other. When the shell and the compiler are just user-mode programs, they can iterate much faster, and you can choose one that suits your style. You can use the same program for both, if you really want to - for a few weeks I used tclsh as my login shell - but it turns out the tools you need for programming are quite different from those for general computer use. Division of labour is ultimately a good thing, even if it means less of the population has any specific skillset.


No commentary on the results of a quick google search of "tryruby" "trypython" "tryclojure" pretty much try* where * is a currently popular language. Or even not so cool, I found a "try brainfck" at

http://www.compileonline.com/execute_brainfk_online.php

I'm specifically pulling lists of "try" services rather than the much heavier "I am an IDE in your web browser" services which might be somewhat overwhelming as a first introduction to programming...

Take your web browser appliance, make "http://tryclj.com/" your web browser appliance's home page, all done.


I kind of miss BASIC. One of the good things in those days was that printing a few lines or drawing a graph was about the limits of what the computer could do so it seemed cool. You can now run BASIC in a javascript interpreter in your browser (http://www.calormen.com/jsbasic/) but it's not cool anymore. I'd guess the nearest equivalent of something simple to learn that you can impress your mates with these days would be javascript - then you can make apps, funny effects on webpages and BASIC interpreters if you get good at it.


I've just finished reading Petzold's Code after having it on my reading list for several years. I expected something along the same lines as Code Complete and got something entirely different. And boy, was I in for a ride. I read it in two days straight. I think this book can offer an answer for those of us who were too late for the type of computer mentioned in the linked article. Do any of you have experience giving this book to people who don't have much to do with technology, or even to children?


> I think this book can offer an answer for us who were too late for the type of computer mentioned in the linked article

What is the question for the answer that the book gives?


I think the best thing anyone can do to encourage people to learn programming (with what we currently have) would be for Google to include a RAD IDE, with a simple interpreted programming language like Python, in the OS itself. Something standard, across one of the currently most popular platforms. When first launched, it should give you a programming manual.

But I can imagine them ruining it. It would be tied to their cloud service (oh my tablet's offline? Half of the features won't work, the manual is gone and I can't share my code). It would be updated all the time so wouldn't be stable. The runtime and IDE would be so fragmented (one of the benefits of ROM was that it was expensive to burn so tended to be quite stable over a long period). And of course there would be the AOSP version and the Google version furthering fragmentation.

Modern computing has turned me into a cynic. I'm slowly starting to hate what our industry has become. By the time I was old enough to start my professional career, the world I had fallen in love with was gone.


Not exactly what you mean, but...

    Ctrl+Shift+J
    
    console.log("Hello World");


As a kid, I first learned how to program in an 8-bit BASIC environment. This proved to be a mixed blessing.

On the plus side, it was every bit as approachable as this post makes it sound. Not just because of its "always-on" nature, but because of the relatively small learning surface it presented: line numbers, GOTO, a few operators. It was comprehensible in a way that more complex languages weren't.

On the minus side, soaking my young mind in the paradigm of structuring program flow around line numbers bent it in ways that didn't become apparent until I got a little older and tried to graduate to languages like Pascal and C. I struggled to get comfortable in these environments in ways that other peers with less programming experience did not, precisely because I had internalized so much of BASIC's skewed way of thinking about program structure. It took a fair bit of time to un-learn the bad habits all that BASIC programming had taught me.


I learned the ropes with QBASIC. It's as approachable as good ol' BASIC, but it had text labels instead of line numbers.

Thus for me the biggest steps in moving to C when I was 12 or so were string processing and the concept of having to compile it, plus the weird include files. And why do I have to pass -lm to djgpp so I can use cos? Those were the days.


David Brin and others have lamented the lack of a ubiquitous, interpreted, always-there programming environment to help get kids started down the path to software development. I tend to agree that it is a lot harder to penetrate the first layer than it used to be. So, from someone who also started with ROM BASIC, thanks for the memories.


Montfort et al described the same thing in the book "10 PRINT CHR$(205.5+RND(1)); : GOTO 10" [0]

[0] http://trope-tank.mit.edu/10_PRINT_121114.pdf


What this article boils down to is two things: 1) Computers used to have a built-in language that would be immediately available, and 2) that language had a REPL.

If you wanted to get the same experience in any REPL without line numbers (and with Lambdas), you could say:

    mainprog = lambda{some lines of code};
    exec mainprog;
instead of:

    10 line 1 of code
    20 line 2 of code
    RUN
In the case of BASIC, prefixing a line number is a shortcut for saying "variable LINE-XX = lambda{some code}". Now we just need to get computers to come up with a standardized REPL readily available at boot time. Maybe if web browsers started to default to having a Javascript console prominently displayed at all times.
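That line-number-as-named-routine idea can even be mimicked in an ordinary shell. A rough sketch only: the `line_NN` naming convention is mine, not how BASIC actually stored lines.

```shell
# Each "numbered line" is a function; RUN executes them in numeric order
line_10() { echo "line 1 of code"; }
line_20() { echo "line 2 of code"; }

run() {
    # List defined functions, keep the line_* ones, sort by number, execute
    for fn in $(declare -F | awk '{print $3}' | grep '^line_' | sort -t_ -k2 -n); do
        "$fn"
    done
}

run
```

Because `run` re-reads the function table each time, redefining `line_10` or inserting a `line_15` at the prompt changes the next RUN, much like retyping a numbered line did.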


If you wanted to recapture that feel of an instant-on console, you could set up a computer that booted into lighttable. It supports mouse and graphics and music. But to get a web browser up, you call a function.


The author mentions jokers breaking out of his Atari demo loop written using BASIC.

I first learned BASIC on the Radio Shack TRS-80 Model 1. As a child, I had a favorite jokester trick involving QBasic on Windows-era machines. I'd visit a department store, break out of each store computer's demo loop to DOS, and then type up a simple QBasic loop with random SOUND calls, where I had painstakingly tested and memorized random frequency ranges and delays to simulate the sound of running water.

Returning these computers to their store demo loop left a sea of scratched heads in my wake.


I started with BASIC when I was 6. That thing had a built-in calculator with parentheses (and calculators were not that common in the early 80s). So it was totally fascinating. Of course I never realized there could be anything but global variables, or that GOTO was considered bad style.

It was possible to code a small program that drew on the screen (moving with the IJKL keys) in several minutes. Fun times indeed, but I am not sure it's applicable to young kids any more.


> How did IDEs go so wrong?

I'd say it's a rhetorical question, but the IDEs part makes me doubt it. Programming is no longer immediate; graphics are difficult. All that.


The best thing about line-oriented languages like BASIC was that, lacking abstract functions, the language worked exactly like the machine worked. If you wanted functions and parameters, you had to implement them much like you would if you were writing assembly.

Learning BASIC with line numbers and GOTO meant learning how the hardware control-flow worked. This is abstracted away in all languages today.


If you are on a Mac, you can really do all this with AppleScript, which is also a bit Smalltalk-like, and which, as of Mavericks, lets you create libraries quite easily.

I use it a lot as a calculator when I'm not using it for automating the UI. It is great; IMO, the best thing about a Mac.


You can do it with environments provided with most OS's. The big difference is that it is not put in front of people to the same extent that makes it trivially discoverable, and available from the second you press the on switch in the case of most of the 8-bits.

AppleScript is a particular peeve of mine. I was a long-time Amiga user, used to "everything" being glued together via ARexx ports. OS X has the capability, yet the environment is very different. It is far more rare (presumably not Apple's or developers' fault per se - more a reflection of a different demographic) to see OS X application manuals call out script integration as a major feature, or "hang" all their own automation off of it, even for the ones with good script integration (and though I'm a Linux user at home, this is a sore point: Linux script integration is woeful in comparison).


I think the important message here is that simple programming has long been the niche of hobbyists and entrepreneurs, it's about time that we cornered this market and commoditized it. It should be really easy since it's small and unimportant.


Don't forget that the first IBM PC (8088, 16-bit) also had ROM BASIC, as well as a cassette port.



