Thank you, Ray. Any old compilers in your time machine as well? ;-) I had (incorrectly) presumed that this older K&R style of C predated Windows and was never ported from Unix.
Until a few years ago you could still find K&R declarations in the Ruby source code[1]. Wasn't until 1.9 that they switched things over to ANSI style, iirc.
So, believe it or not, you might still find it out there in the wild.
FreeBSD still uses this declaration style liberally. (Keep in mind FreeBSD is a fork of BSD, which has been around for some decades, so it makes sense that there's some old code floating around in the kernel.)
Any chance of uploading 1:1 floppy images of those? Been looking for Windows Premiere Edition for a while, and now parts of its SDK (and of a Win 1.0 beta!) show up, and from Ray Ozzie too!
Under Windows it is easy to create a console application, which is similar to the usual C hello world. But the application here creates a window to show hello world, so it's fair to compare it with a similar program on GNU/Linux/Xlib.
EDIT: Yes, it is much easier to use something like Qt on GNU/Linux, but there are also abstractions around the WinAPI available, so using Qt on GNU/Linux for a hello world example wouldn't be a fair comparison.
My Master's thesis in electronic engineering was supposed to be based on a previous work written by a maths PhD in C++ on XWindows. I'd only used C and Basic before and not done any GUI programming. I couldn't believe how much code was involved to set up a few simple windows. It really put me off and I pretty much gave up and started writing my own [image manipulation] code in C. I wish I'd had the sort of resources you can get for free nowadays; it might have been a fun project!
It was the mid 90s and I didn't have unix/linux at home. In those days you had to pay for a compiler on windows and I didn't know anyone running linux. Just a couple of years later it was so much easier!
A story. As Petzold writes, the Windows version of "hello world" was amazingly baroque. It bothered me, a lot. I had come up in the four line hello.c world of UNIX, and later the make/test world of UNIX, and when I had left Sun the idea of writing code like the Hello World example was something I could not bring myself to do. I rebelled in a bunch of different ways, but the worst part was it started a period for me where I programmed a lot less, I was getting curmudgeonly because programming had become so screwed up.
The thing that changed for me was getting a chance to help a young man with his programming who had come up through Windows rather than something easier, and he was full of all the wonder and excitement I had had at his age for programming, and had no idea that he was running with a large handicap to his productivity. And partly because many of the "tools" you had to have like Visual Studio worked their butt off to make that gap small, from self writing code to documentation at your fingertips. I realized that if I didn't get the 'wonder' back of writing code I was going to become one of those guys who sits around grumbling about how in my day things were so much better and FORTRAN just worked dammit :-).
What I tried to learn from it was that systems with great generality suffered from usability efficiency. The reason UNIX hello.c was so simple is that UNIX programs, by and large, did simple things: algorithmic computation driven by interaction over a character-based interface. One way to drive effective use of a system was to either eliminate or otherwise hide the modalities that were "possible" but not probable in future execution of the program.
That has helped me see when I'm building something that is letting too much generality leak out to the surface.
"What I tried to learn from it was that systems with great generality suffered from usability efficiency."
I'm entering "curmudgeon" age myself, at least for our industry, and what I'm getting curmudgeonly about are my fellow curmudgeons, and that's part of why. Oh, programming was so much better in the 8-bit era? So go back there then. It's still available to you if you want it! Oh, you don't want to take actions to match your words? Why not? Because, in a nutshell, no it wasn't easier. It was easier to do the things it could do, sure, but it can't do much. So stop telling me about how wonderful the 8-bit era was.
That's just one example. There's several of these recurring rants that come up here on HN periodically.
Well said, in particular this bit, "Oh, you don't want to take actions to match your words?"
There is a downside though when you get a curmudgeon with a fondness for the "old" days who takes action by re-implementing your build system to look like something you would read about in an IBM JCL handbook :-) I'm sure 8-character error codes were easy to remember, but even better is a short sentence on what the error was.
Certainly there is advantage in having a simple hello-world program. The original purpose of hello-world, naturally, is just to have something to verify that your environment and tools are properly installed and configured before you go off and do anything serious.
But because Petzold used "his" hello-world program to introduce you to a bunch of critical, foundational concepts -- to show you everything you needed in order to get a functioning top-level application window open on the desktop -- I myself found it a great place to start. By forcing you to confront the message loop from square one, it really reinforced the idea that you aren't in Kansas anymore (and you can't do things like the Kansans do).
As Petzold hints in his article, the original version of the book (for Windows 2.x) didn't even start with that program, but worked its way up to hello-world through a sequence of 5-10 iterations that got you to that point over the course of an entire chapter. (Thus for example one iteration actually opened a top-level window but lacked a message loop, so the program exited and the window immediately disappeared after it was created.) It makes sense that he had to edit out such a lackadaisical introductory chapter as the subject of Windows programming got bigger and bigger, but I for one always liked that chapter.
Brings back horrid memories of learning Windows programming back in the mid nineties. From the very first line of code, Microsoft had you doing it their (nonstandard) way, making you use WinMain() instead of main(). That style, too... Yuck! Even today, I can somewhat reliably identify folks who were raised on the Windows SDK by looking for lpszHungarianNotation and UNREADABLEALLCAPSTYPEDEFS.
Does anyone remember why the original samples allocated structures on the heap, did stuff with them, then immediately freed them, rather than just using them on the stack directly? Was that a thing? Were early Windows systems stack constrained or something?
I found it a lot easier to learn Win32 when I discovered that much of it was just MS' own styling and not actually necessary at all. A lot of the typedefs are artifacts from the Win16 days (e.g. you can just use a void* instead of LPVOID). This is a perfectly valid "Hello World" program, in a more traditional C style:
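A rough sketch of what such a program looks like (illustrative only, using just windows.h and plain C types everywhere except the entry point, whose WINAPI signature is required by the platform):

```c
#include <windows.h>

static LRESULT CALLBACK wndproc(HWND wnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        HDC dc = BeginPaint(wnd, &ps);
        TextOut(dc, 10, 10, "Hello, world!", 13);
        EndPaint(wnd, &ps);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(wnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmdline, int show)
{
    WNDCLASS wc = {0};
    MSG msg;

    /* Register a window class, create one visible window of it,
       then pump messages until WM_QUIT arrives. */
    wc.lpfnWndProc   = wndproc;
    wc.hInstance     = inst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
    wc.lpszClassName = "hello";
    RegisterClass(&wc);

    CreateWindow("hello", "Hello", WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 320, 240,
                 NULL, NULL, inst, NULL);

    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```

Note there's not a single LPSTR, LPARAM-cast macro, or Hungarian-prefixed local beyond what the API itself hands you.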
Depending on whether you specify /subsystem:windows or not in the linker settings, you can even get a console window (and thus printf() to it, etc.) and use it for logging etc. alongside the GUI, something that isn't often mentioned.
Another thing that probably drives people away is the insistence on starting with a "whole window" app, one that uses CreateWindow/RegisterClass, handles WM_PAINT, etc. when for a lot of purposes, a "modal dialog-based" app using a template and DialogBoxParam would be sufficient, and writing one of those is certainly easier. I've made a lot of trivial GUIs of the "window with a set of buttons to do various things" type with Win32, and it definitely doesn't take all that much work. Nothing more than two functions, one main() and one wndproc(), switch on the message/control in the wndproc() depending on which button was clicked, and do something accordingly.
I think it's a bit of a shame that the majority of the documentation on it is both overly complex and nonstandard in style, because the true "essence" of Win32 is really a small and simple C API. It's easy to create really small programs with it, and it doesn't take much code to do that either. When I learned WinSock I wrote a simple netcat-like GUI terminal emulator and it was <10KB.
Were early Windows systems stack constrained or something?
Win16, certainly. All apps ran in a segmented address space and the stack was usually in the 4-8KB range.
My one development job involving Windows code was a cross-platform Windows / some-embedded-os-that-used-Win-32 / Linux app that ran an FTP server and SNMP server.
No GUI stuff, but all the differences between Win32 and Linux for the APIs I used were abstracted away with a few handfuls of typedef's and #define's.
Look at "Minimal dialog-based WIN32 application" - it has some code to do some other stuff but the basic skeleton is the same. Here is another example of the minimal code:
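A minimal dialog-based skeleton goes roughly like this (a sketch, untested; IDD_MAIN and IDC_GO are hypothetical IDs that would be defined in an accompanying .rc file containing the dialog template):

```c
#include <windows.h>

#define IDD_MAIN 100   /* hypothetical dialog resource ID */
#define IDC_GO   1001  /* hypothetical button control ID  */

static INT_PTR CALLBACK dlgproc(HWND dlg, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_COMMAND:
        /* Switch on which control sent the message. */
        switch (LOWORD(wp)) {
        case IDC_GO:
            MessageBox(dlg, "Button clicked", "Hello", MB_OK);
            return TRUE;
        case IDCANCEL:   /* sent when the user closes the dialog */
            EndDialog(dlg, 0);
            return TRUE;
        }
        break;
    }
    return FALSE;  /* let the default dialog handling run */
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
{
    /* The dialog template handles layout; DialogBoxParam runs the
       message loop for you, so there's no RegisterClass/WM_PAINT. */
    return (int)DialogBoxParam(inst, MAKEINTRESOURCE(IDD_MAIN),
                               NULL, dlgproc, 0);
}
```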
This may be a bit of an odd recommendation, but also look for tutorials on how to create keygens - the majority of those are dialog-based since the interaction fits the model perfectly - when a button is clicked, read one or more text fields, do some computation, write the result to another text field.
After I spent $59.99 on Petzold's "Programming Windows 95", I ended up just using it as a reference and instead went through TheForger's (from #winprog) tutorial: http://www.winprog.org/tutorial/
As I recall, there was a different entry point to distinguish DOS from Windows programs. A Windows program started in DOS would have a stub main that just printed out the "this is a Windows program" warning.
It's actually quite a reasonable segregation to do when you consider the different kinds of setup each type of program requires, not that there aren't other ways around it, of course.
That's a different thing, actually. There are two distinctions here:
- Win16/Win32 programs in NE/PE executable files, which could be:
-- Console subsystem, where they'd have a stdin and stdout and all that, take a char* argv, their entry point was main(), etc. and if run directly would open a console window, and if run from a console would block the console with their own execution.
-- Windows subsystem, where they would have the more elaborate entry point (WinMain), could get unicode argv, etc. If run directly they would not spawn a console, and if run from a command line would appear to return immediately.
- DOS EXE files, which could obviously only be 'console' and only had one entry point.
NE/PE executables (of either sort) had a stub DOS EXE at the start that would print that message.
As another post pointed out, nothing was actually stopping you from using the gui windows apis from a console app, incidentally. It wasn't done very often though.
I used to have much disgust for Hungarian notation. When I worked on my first big C++ codebase, though, Hungarian notation was standard, and I have to say, it made the code more readable. Using 's' for static, 'm' for member, etc. brought some extra clarity as to what was happening. It's not that we couldn't determine these things for ourselves (VS has great code-crawling facilities), but it's less time hovering, symbol searching, and jumping-to-declaration.
I don't use it out of choice, but it isn't so bad. Though, this was using HN slightly differently; putting complete variable types in the name is something I have different feelings about!
I think you're mixing two absolutely different types of Hungarian Notation.
No offense, but what you describe seems more or less like the reviled version (stuffing ~redundant~ information in the name, where you end up with strName and iCounter).
The more-or-less accepted version described in the GP is trying to fix a shortcoming of the language's type system. If you cannot tell your compiler that this is a (default example) pixel value and this is a color, because both are plain ints to it, you're trying to make accidents less likely by requiring the programmer to double-check the prefixes (which encode a subtype, a special type. Not int, but int-representing-color or int-as-a-bool-here).
iCounter always makes me wonder why someone would be iterating through counters! I've still got the:
for iObject = 1:nObject
style from matlab coding firmly rooted in my head - when hungarian isn't even creating universally and easily understood/parsed names I just can't see the value.
There's a limited degree to which it's useful in resolving scope and type ambiguities in C or C++, but excessive use could definitely render things much less readable. It doesn't help that some of those letters survived long past their shelf life (e.g. 'lp' for long pointer, where in non-segmented architectures there is no far (aka long) and near pointer distinction).
These days in C++ I'd only ever consider using s or g, and never the type specifiers. I also largely consider signalling member variable vs. local variable to be pointless in all its forms (including _ suffix or prefix) now.
"used in my example above where we decided that us meant “unsafe string” and s meant “safe string.” They’re both of type string. The compiler won’t help you if you assign one to the other"
Assuming you have the luxury of a language with a good type system (either because it's designed for the task in hand or it's extensible), the compiler can help you, and you would be much better off having unsafe and safe strings as separate types. Then the encode function simply becomes a function of type unsafe -> safe. I believe Michael Snoyman touches on this in his presentation, "Designing Type-Safe Haskell APIs"[1].
I'm not arguing that Joel's method isn't a good idea. However, if you can it's better to leave hints for the compiler, not just the programmers.
's' for static, 'm' for member is not Hungarian notation. They're just established prefix patterns. Some style guides use '_' as a suffix for class member variables, and a g_ prefix for globals. It's just consistency. Hungarian notation is a nightmarish replacement for IntelliSense, such as lpszFoo.
Out of that I use m_ for members, but that's about it. It is very helpful when you read lengthy method code. For actual names I use snake notation, i.e. something like m_frame_count.
Type in the name is overkill for int or char, but it's useful when you've got multiple levels of pointers. ppszName is a pointer to a pointer to a null-terminated string, whereas pbFoo is a pointer to a byte array. Both might be typed as void*, but *what* they point to is important.
Additionally, you can have the pairing of pbFoo with pcbFoo, where pbFoo is a byte array, and pcbFoo is a pointer to the size of pbFoo.
All of that can be checked at the definition, so there's no need to encode it, but it can be convenient sometimes. It's a question of balancing how much redundant information you want to put in the name to save looking up the definition. If you're dealing with a pointer, you'd better look at the definition anyway unless you want to risk getting into trouble, so there's no need to put any of that in the name.
Void pointers are a major mess which should be avoided altogether unless there is really no other choice. And putting anything in the name of the void pointer doesn't really prevent it from pointing to something else entirely.
Please stop modding me down! In my work, I have to work with developers from many countries. Understanding cultural differences is very important when many nationalities are involved. There was nothing pejorative in this one.
IMHO, you're not contributing, just offering anecdotal evidence that could easily be construed as an unsupported attack.
For instance, I'm not a native English speaker and I don't see the use of Hungarian notation (especially when used heavily).
There's a difference between a couple of handy conventions and things that actually make your code hard to read and less flexible (change the type, change the name too).
I'll contribute something I noticed, as a native English speaker, when I started defaulting to too-terse in my own style: Instead of spending time worrying about how to capture enough meaning in a concise name, I can write "km", write a comment beside the declaration like "keyboard mapping", and I'm done. The symbol-meaning is mnemonic, the documentation is verbose.
Or to consider it another way, we've constructed the idea of variable naming being essential under the premise that we want code to relate to native language at each step. But when code is put into an interlingual context this breaks down relatively quickly - at the extreme end, the non-native speaker has to reverse-engineer meanings anyway. In the terse/documented style, I more explicitly acknowledge this separation of actions and definitions.
A lot of the thinking around naming conventions feeds into the expected workflow - when considering the pre-Intellisense, namespace-free era of C coding that Windows Hungarian arose in, it makes sense to bulk up the names a little so that every point of the code conveys more meaning and doesn't collide by accident. But if the environment already gives you ample guidance towards meaning and categorization, the bottleneck revolves more around how much code fits onscreen.
Your stack, globals, and local heap were all allocated out of the same 64KB segment. Any locals in WinMain are effectively allocated for the entire lifetime of your app, and took up some of that limited space.
Normally, you'd save the stack space by putting the structure definition in a separate function, and you'd regain the space on return from the function. But for these Hello World apps, you're over-simplifying to death, trying to reduce the number of functions. I suspect the LocalAlloc here is vestigial, from an earlier hello app where the HelloInit function was inline.
I first got into programming when I was around 13. I wanted to make games (little did I know how tall an order that was!). So eventually I stumbled upon code that looked a lot like this, and I started hacking on it. I would tweak things until I understood why it was there, and then move on. I guess since I didn't know that something better existed it was the only world I knew. I had a lot of fun playing around. What a different world we live in now though. Stuff that would have taken me weeks to figure out can be done in 5 minutes today.
Stack was pretty small. Allocating structures on the heap instead of on the stack was pretty standard. We just need to understand the historic context before passing judgement.
Brings back memories. I recall learning to write from memory the 40+ line win32 hello world from the Petzold bible. That's the way it was back then ... but you got super speed from it. Event loop was blazing fast. I discovered MFC a bit later and worked as a professional dev at an ISV that made use of it. I have fond memories of reading the v1 MFC manual - it was small enough at that time so that a person could understand the whole thing with relative ease.
Some interesting notes on what happened in a few years .. when I first saw .net Compact Framework, it blew my mind how easy it was to write .net code/WinForms stuff. Even though I was relatively poor, I bought an MSDN subscription on the spot and started to write mobile apps. This was early 2000s. If MSFT had paid proper attention to mobile devices, they could have rocked it.
I faced weird challenges. Companies would ask me for 20K to certify my free apps on their phones. It was disgusting. I sold a few apps on the Handango app store, and then got out of mobile dev (missed the whole iPhone revolution). Talk about bad timing.
It brings back memories indeed.. me watching in disbelief at how long, ugly and weird Microsoft C/C++ Win32 or even MFC programming for Windows was, compared to Borland Delphi or C++ Builder. I tried, I really tried to like it, but in the end I gave up, and I became a Windows programmer much later with the arrival of .NET WinForms and C# (created by Delphi architect Anders Hejlsberg, wikipedia.org/wiki/Anders_Hejlsberg). It's scary how scarce sane language/framework/IDE architects were back then.
For those of us watching from the Unix world while Microsoft fumbled its way towards world domination, it was an incredibly frustrating time to see the banality of their technology become 'the thing' in light of so many, many, forgotten and ignored mysteries. It was something like what it might be like to see a new religion form for your children while watching that of your parents die.
Thank Linus we still have an alternative to Microsoft, here and now! It is absolutely a huge gain for us that Unix was not eradicated by the feral, vermin new thing.
Oh but there was VMS and Lisp and others that are gone, with all the ideas. All replaced by the Unix way. The Unix way is gone, too. Linus should be given some credit for that as well.
BSD existed long before Linux was even a vague idea in someone's head, and the free-software BSDs were already springing up independently of Linux, though it took until 1992 for 386BSD to be released. Patchkits soon followed, with FreeBSD and NetBSD popping up quickly.
Linux or not, it would have happened. The difference is that if the lawsuits didn't encumber the community and create uncertainty, it could have been the dominant Unix-like today.
I have no idea what the hell you mean by "Linux making the idea popular to culture".
I've been there and aware of the situation since the beginning. BSD may have been 'before Linux', but it wasn't the best-promoted of the group of Unix OSes; it was always easier to get Linux than BSD. If Linux wasn't doing all the stupid shit that made people reconsider Unix as a personal operating system, I don't think the idea of Unix on the desktop would have gained as much traction.
Anyway, this is all "what if" line of thought .. just my opinion. But I've been a Linux user since the days of the minix-list, and a user of Unix-based systems since 1980.
Wow. Interpreting this under the "hello world is often a demonstration of a particular point" rubric ( http://www.win-vector.com/blog/2008/02/hello-world-an-instan... ) it looks like Petzold was trying to demonstrate that a mere finite amount of code could launch a Windows API client. Large was, of course, undesirable, but he was trying to see if the API costs were even bounded.
I really miss the "old days" of programming. Things were simpler then. I fondly remember the early 80s programming on a Commodore 64, then moving to the early IBM machines and DOS.
Even though I favour a *nix environment, there is more sanity in Windows programming circles, methinks. I have been thinking about getting into some .NET development these days and moving away from doing *nix stuff, although I really love the BSDs.
It's pulling teeth to get 3rd party library developers to make anything up to the standard of the BCL. A lot of times it feels like someone just copied a Java library into C# and did a bunch of global text replaces until the compiler stopped complaining. The BCL is very extensive, but if it's not already in the BCL, .NET is a pain.
And then MS came up with MFC which was slightly better. But only slightly.
Borland had a better solution if I remember, in their C++ Builder thing.
Another "adventure" I had in Win32, trying to make a dynamic dialog box (basically, selecting something in a list opened up a list of properties - like the Preferences dialog box in most browsers). It was some time before I figured out how to do it (there was a coordinate conversion needed)
But it was always behind Microsoft's MFC in terms of features. And you had to drop out of it for advanced stuff and then you had to spend a bunch of time figuring out how it actually worked.
> And you had to drop out of it for advanced stuff and then you had to spend a bunch of time figuring out how it actually worked.
This was really no less true of MFC, and imo the OWL code was a lot cleaner and easier to understand. In the end, though, they were both just wrappers around the windows api.
But the GP seems to be referring to VCL, which is what came with Delphi/C++Builder. One of the reasons OWL was usually behind MFC was that Borland basically abandoned it and their mainline compiler products for the Delphi line, and C++Builder was always a second-rate product behind Delphi as well.
MFC was much more of a thin wrapper than OWL was, IIRC.
Microsoft concentrated on bringing MVC-enabling classes for application development, and otherwise just basically wrapped the raw Windows API, whereas Borland actually created an honest-to-gosh object library to represent the GUI (and to an extent, provide a migration path from TurboVision, their DOS GUI library).
VCL was even thicker still, having come from Delphi's rather successful attempt to duplicate the Visual Basic VBX components in a highly-extensible and more native way.
Like you hint at, it was kind of a weird fit for C++Builder, since Object Pascal had somewhat different capabilities that they had to lay into CB making it a little bit of a bastardized spinoff of C/C++.
.NET/C# was ultimately what came out of that heritage, after MS poached Delphi's architect. I think it benefited from not claiming to be C++.
I read a story somewhere that MFC was originally much like OWL, but when Microsoft gave it to their test groups they found it too OO, and thus MFC as a thin layer was born.
It was some time before I figured out how to do it (there was a coordinate conversion needed)
That sounds overly complex - all you need to do is to catch the listbox selection message, hide all the controls that shouldn't be shown with the current list selection, and show all the ones that should be. I don't see how a coordinate conversion would be needed for that.
I couldn't even imagine the breath of fresh air it must have been to have worked with Windows programming in the 80's and then discovered something like NeXTSTEP.
Actually it wasn't as revelatory as you might think.
Windows programming was irrelevant in the 80s -- it wasn't until Windows 3 and 3.1 that it really caught on. In the DOS world there were lots of "character-mode windowing libraries" (I mean lots : I think we once reviewed 15 in a single issue of Computer Language) and they all generally had fairly straightforward programming models. People were much more focused on, for instance, memory management.
When NeXTSTEP came out, word traveled by the magazines and BBSes and Compuserve fora. You couldn't really communicate the NeXTSTEP programming model in pictures, and so the only people who I think really got NeXTSTEP were people who'd already experienced Smalltalk! (Which, at the time, was at its peak, with Smalltalk/V available on DOS machines and ParcPlace available on UNIX systems.)
Coming from DOS programming, where you had to draw your own "windows" using extended ASCII characters and listen for interrupts to handle mouse clicks, I thought the Win32 API was awesome. I think NeXTSTEP would have blown my mind if I'd had the chance to touch a NeXT box.
You spend a goodly amount of time looking at the interface builder and trying to figure out why you did not have to add code to the button or subclass it. There is a definite "wow" moment.
Drawing your own Windows was great because every application had its own character and style. It was an art. Then Windows came and made everything boring. (I see Zuckerberg following Gates footsteps--let's make the interface boring.) My mind was blown by Amiga--the music, multitasking, deluxe paint, video editing, 3D animation, it was a machine for real professional artists that made Windows & Mac look like a Timex Sinclair.
It was pretty amazing although VB probably gave a lot of people the same reaction. NeXTSTEP and the Newton were probably the two biggest reasons I always hated Windows programming. Both had great interface ideas that Microsoft should have stolen.
I swear I muttered "OnOK" in my nightmares for years.
> That version probably has as much relevance to real-life Windows programming as printf("hello world") has to real-life character-mode C programming.
A simple program has a lot of advantages: it shows you can actually compile and run a program at all. When you have never done so before, there are hundreds of hurdles that regular developers would not even consider when starting out.
I remember back in the day being annoyed and frustrated that the two functions in the canonical Petzold "Hello World" were WinMain() and WndProc(). "Why not WinMain() and WinProc() ?" I used to silently scream. I must be getting more mature because I am not quite so pedantic about these sorts of things these days.
"Signals have been around since the 1970s Bell Labs Unix and have been more recently specified in the POSIX standard." --wikipedia
Basically, they're a software interrupt handler. Not too different from assembly. I don't see why that would be innovative at all?
In fact, early C++ just compiled to C. You can write structs with function pointers and vtables all in C if you wanted. People have been doing that since the 80s.
Forget the POSIX/Win32 platforms, function pointers can easily be found in the standard C library itself. I believe there are at least 2 functions taking function pointers as parameters:
1. The 'qsort' function takes a function pointer to decide how to compare your data in order to sort it, presumably using quicksort.
2. The 'signal' function has a POSIX origin, but it did manage to make it into the C standard. It takes a function pointer to decide which function to call on receiving a particular signal.
I was using C function pointers routinely in 1991, way before I ever touched Windows. It was a straightforward thing to do after using them in assembly language throughout the 80s.
I had that book (Programming Windows) as a preteen. This is actually Win16 API programming--the event loop gives it away. This was before Windows had preemptive multitasking (which came with Win95, IIRC). If you forgot to yield at the bottom of the loop you'd lock up the whole OS.
As a unix/c coder in the early to mid 90's, the future seemed to be Windows and I dreaded having to learn that gobbledegook. Then Java came along and I never had to. Thank goodness.
http://ozz.ie/h31100