Turbo Pascal

Turbo Pascal is one of those development tools from the late 80's / early 90's that I had heard of but never actually used or even seen. In fact, the same could be said of the Pascal programming language as a whole for me. Until 2018.

I now greatly regret that my younger self in the mid 90's never got a chance to use Turbo Pascal. As I've written about before, I started out with QBasic and then moved on to C/C++, and then soon after went back to QuickBASIC 4.5 while adding assembly into the mix to do more performant graphics. The reason I ended up moving back to QuickBASIC from C/C++ was that I greatly preferred the easier/quicker edit-compile-run cycle and the simpler debugging environment (plus the inline help system in QBasic/QuickBASIC is really awesome too).

That last point is what really makes me, in 2018 looking at Turbo Pascal for the first time, think to myself: "Wow! This is what I really would've loved at that time!"

I've actually had this since early this year but only really dug into it over the summer. I originally got it after reading posts on some vintage/retro computing forums where a number of people praised Turbo Pascal as a great development tool combining an excellent IDE, ease of use, inline help and advanced features (pointers, inline assembly, etc). I was intrigued and figured it might be good to play with, as I had been interested in getting into FreePascal and Lazarus anyway (maybe, for various reasons which I could probably write a whole post about).

So what do I like about Turbo Pascal and why do I think my younger self would've really liked it as well? In no particular order:

  • Easy to learn language with (to me anyway) a kind of "BASIC-like" feel to it.
  • Really awesome modules/units system. This honestly feels like it was way ahead of anything else at the time. It makes code re-use between projects incredibly simple. Just copy the TPU file (compiled unit) around and reference it in your program under the uses directive to start using it. Really easy.
  • Pointers!
  • Inline assembly (though 16-bit / 286 only). Huge deal for me, as I remember getting annoyed at debugging assembly issues in my QuickBASIC/assembly mixed-language projects. The 16-bit-only limitation can be worked around in a hacky way by directly writing hex opcodes into your inline assembly if needed. It amuses me that I've actually heard Turbo Pascal referred to as a "shell for assembly", referring to the fact that a lot of projects where speed really mattered would have large amounts of inline assembly everywhere. For me in 2018, if I ever work on a project that gets to that point, I'd likely just switch it over to C honestly.
  • Slightly more advanced type system than BASIC, but still quite simple. One thing I've noticed so far is that I generally feel a bit safer writing Pascal code that uses pointers (for example) than I do with C, I think due mostly to the type system.
  • Very fast compiler! And your code is actually running compiled (even from the IDE) rather than being interpreted, as in the case of QBasic. So your code is going to be a fair bit faster than QBasic right away. One side-effect I've noticed as a result of the blazing fast compiler is that I'll often compile my code as I write it (even if I don't intend to run it quite yet) simply as a check for any syntax mistakes, etc.
  • Beginner-friendly IDE that is very fast and that allows you to immediately start writing code and running it (exactly like QBasic). Also includes debugging support that is roughly on par with QuickBASIC (but does have some extras, like inspecting register values while stepping through your inline assembly).
  • Coloured syntax highlighting in the IDE!
  • Inline help system with search and plenty of examples.
  • Run-time checks to help you debug code (which can all be turned off for additional performance).
  • The IDE runs itself under DPMI (optionally) so that your code (which always runs in real-mode) has access to all of the remaining 640k conventional memory. This is a massive improvement over QuickBASIC! I very vividly recall getting really frustrated with one of my late QuickBASIC projects which was becoming impossible to run from the IDE due to it always sucking up 200-300k of conventional memory.

I've often read from people who learnt to program as kids in the 90's that they progressed from BASIC to Pascal and then to C/C++. I can kind of see now why that progression makes sense. To me, Turbo Pascal really does feel like either a much more advanced QBasic, or a kind of C/C++ "on training wheels" (sort of).

Turbo Pascal 7 also includes some object-oriented programming features. Actually, these were introduced back in Turbo Pascal 5.5, but people seem to say that it wasn't until version 7 that Borland had ironed out most of the bugs. I don't see myself using any OOP features that much (if at all), for the same reasons I now stick to C. I just prefer the procedural approach.

The limitation of your code needing to run in DOS real-mode is unfortunate to me in 2018, but if anything this just enforces developing for DOS the way it really was done for most of its life... with 640k conventional memory, heh. Of note, Borland Pascal 7 (which, from what I gather, is the "professional" version of Turbo Pascal) apparently did include some ability to have your code run under DPMI and also added some 32-bit / 386 assembly support. However, I've read enough bad things about Borland's DPMI support in general that I'm not particularly interested in trying it out for myself.

For my current "retro" programming projects, I don't see myself using Turbo Pascal to tackle some of my more ambitious ideas (such as the Doom-style engine I still want to try doing), but for simple 2D graphics stuff I actually think this will be an interesting alternative to C.

The quicker edit-compile-run cycle is definitely handy. It makes prototyping or otherwise just quickly experimenting with ideas much easier. On my 486, compile times feel instant except perhaps for a full rebuild (which still completes in maybe two seconds). Contrast that to Watcom C where even for simple changes to a single source file, you're still waiting at least several seconds (if not longer). It makes a big difference over a day spent working on a project. I guess that is why many people who do retro programming today tend to use DOSBox or something else on their modern computers. I still refuse to go down this road though, preferring to stick with completely era-appropriate software and hardware!

"Old School" Magic: The Gathering

Over the past two years I've got back into collecting and playing Magic: The Gathering a little bit.

I was first introduced to the game by some friends in my fifth grade class in 1995 (actually, it was a grade five/six split class, but I was in fifth grade). I remember we used to actually sneak in games during class. I was really intrigued by all the cards and the artwork especially. Thinking back now, I recall that the majority of these cards were from the Fourth Edition, Chronicles and Ice Age sets, which all would have been current at that time. One card that one of my friends had sticks out the most in my mind; I thought it was just the coolest creature card ever:

I mean, a 7/7 flying dragon that makes your opponent discard their entire hand when they take damage from him. Wow! I really wanted one! Of course, only years later did I realize that it was really a rather poor card from a playability perspective. 8 mana total casting cost with a 3 mana upkeep? Both of these costs spread across 3 different colours? Yeah, no thanks! By the time you had enough mana out on the table available for use, the game would probably be almost over... if it even got that far.

But then again, thinking back to how I remember our games going at the time... a lot of them really did go on for a long time! I certainly don't remember anyone at the time having optimized decks. Everyone I knew who played was younger (11-12 years old) and it was their parents who were buying them cards. As a result, assuming you even had enough cards to build a deck (60 cards or more), you were playing with what you had. It might have even been with all that you had. Which probably meant you were playing with some really "janky" stuff. Maybe even (*shudder*) a four or five colour deck! So, the idea of playing something like Nicol Bolas in your deck at the time didn't really seem so crazy as it does to me now. And no, none of us had dual lands, and certainly not any of the power nine cards!

Regardless, even though I really wanted a Nicol Bolas card for myself, it wouldn't be until 23 years later that I got one, heh.

Getting a specific card wasn't even the first problem for me at that time. Getting any cards was the problem. I didn't have any money. Heck, I didn't even know where to go to buy Magic cards in the first place. I lived on a farm near a small town out in the "middle of nowhere." There were no stores around that sold such things (so far as I knew). Darn! As luck would have it, later on during that school year, one of my other friends who had moved away the previous year (but who I still would occasionally go and visit, spending the weekend at his house) gave me his collection of Magic cards! I didn't even know he had any, but I showed up one weekend, and noticed he had a bunch in his room, carelessly strewn about. I asked about them and he replied "Do you want them?" I was absolutely thrilled. I think it ended up being a little over 100 or so cards all told. Again, they ended up being all from current (at the time) sets. A majority of Fourth Edition, and a smattering of Fallen Empires, Ice Age and Chronicles. A lot of the cards were in poor condition, clearly having been played many times over asphalt at school during recess or lunch before my friend ultimately got bored of the game. Finally, I could play with my own cards!

My younger brother was also interested in the game after he saw these cards and I remember we would play at home. I don't think we had quite enough cards to build a deck for each of us (I seem to recall we were somewhat short on lands) so what we ended up doing was sharing the same deck. We would play as normal, but both would draw our cards from the same deck. At first, we didn't have a rulebook so we were playing from my recollection of the rules that I learnt from my friends at school... and even that wasn't so perfect (plus I don't think we were 100% correct in following the rules during our games at school anyway). A few things I recall us doing incorrectly: we allowed attacking one (or more) of your opponent's creatures directly, there was no distinction between sorceries, instants or interrupts (and I don't even think we ever played them during the opponent's turn... not sure we understood that aspect of interrupts and instants), we allowed attacking with walls, and regeneration could be played from cards long-since put into the graveyard. There's probably more I'm forgetting, but if you're familiar with the actual rules of the game, that should give you an idea of our games. Also, I do distinctly remember that, in an effort to not upset the other, we wouldn't attack at all until we had run out of cards in our shared deck. At which point it would turn into a real battle-royale! Early on, we did attack earlier in the game (as soon as a creature was in play), but due to the hodge-podge of cards in our shared deck, games would often be very one-sided, and those early attacks ended up just upsetting whoever was taking a beating, so we stopped doing that. Hey, we were both young after all!

The summer after school ended that year, I remember being in the mall with my Grandmother (during a visit to her place; there was no mall within an hour's drive of my house) and her buying me a Fourth Edition starter deck (60 cards!) and a couple of booster packs of Alliances at some kiosk near the food court that sold Magic cards. My brother also got some cards at some point (if I recall correctly, a Mirage starter box later that year). Our games started taking better shape, which was good because I was also playing with friends at school less and less as time went on.

I stayed somewhat into Magic cards until 2001/2002 or so and kept collecting here and there (in particular, I remember getting quite a lot of Mercadian Masques in 1999 and early 2000). I was in high school at the time and in 2001 I remember I discovered one of the history teachers left his room open for students to come and hang out in during lunch. There was a group of -- well, I can only describe most of them in one way -- "comic book store"-type nerds who played Magic there during lunch. I hadn't played the game with anyone other than my brother in a few years at that point, so I was excited to play against some new opponents. I "endured" it for a while, but ultimately got put off playing the game by this group. Really, just a rude bunch of players that clearly weren't there to have fun but rather, seemed to have their fun by insulting people like me who were not playing with well-optimized decks and had less overall experience playing the game. So, I put the game aside for a number of years.

After finishing college in 2007, I was contacted by my friend who had originally introduced me to the game in 1995, inviting me to come to a "draft" he was organizing. I hadn't played in quite a while and the game, as I discovered when I went to this draft, had changed quite a bit in its look and feel. The core rules were of course mostly the same, but the look had changed to something I always felt was more generic or even "sterile", and the artwork on the cards generally felt a lot less inspired and seemed to lack the character or "charm" that a lot of the original cards that I remembered had.

But more than that, after attending a few such drafts over the next year or two I began to realize that the sets now were largely designed with drafting in mind. There weren't really any of the fun, imbalanced, and/or just plain weird cards that you would see in the older sets. This never really felt that fun to me; it just added to the generic/sterile feeling I was getting about the game. At some point in 2008 or 2009 I declined going to further drafts, just saying that I'd basically lost interest in the game.

Fast forward to 2016. I don't really remember what made me want to look up Magic again. Probably I was looking at the box in my closet that had all my old cards in it. But I started thinking about how I enjoyed the older cards much more and wouldn't it be cool if there was some group of people out there who played strictly with these older cards? After googling a bit, I discovered that, indeed, this was actually a thing! Not an official format mind you. But heck, that's probably a good thing anyway given my dislike of where the game has gone over the past 15-20 years.

Most people interested in this format stuck with cards from the original sets released in 1993/94, which was a bit before I started playing, but since Fourth Edition and Chronicles consisted entirely of reprints of cards from the original sets, it was all the same cards to me anyway. Great!

Looking up some cards on eBay and such ... wow, Magic cards sure are expensive! Especially the older cards! However, I pressed on and ultimately, much of my disposable income in 2016 went into buying older Magic cards. Eventually, I was able to piece together quite a collection of cards that fit into the much stricter Swedish rules for "Old School 93/94" Magic, which is the ruleset that I had decided at the time I was going to build for (mainly because I largely consider the much cheaper Revised edition cards to look somewhat ugly).

Currently, I am happy that I've been able to build three separate decks for this format:

They are most certainly not the best or most optimized decks out there, and my ability to actually play the game is still rather limited due to not getting much experience at it, but even still, when I do get to play I enjoy it. This era of Magic was just a lot more fun to me, and I attribute my regaining interest in the game over the past two years to that (and only that).

Unfortunately for me, while there seems to have been a much bigger "Old School" Magic community here in Toronto years back (see here and here), it has diminished drastically since then. I currently know of only one other person in Toronto who plays and one other in Hamilton. I meet up with the person from Toronto every other month or so for some games and we have fun, but even still, it doesn't scratch my itch to play a bit more.

Thankfully this "lack of players" problem seems to be common, which you can imagine for what is a very niche (and especially now in 2018, prohibitively expensive) format of the game. So people in the online community from around the world have started playing games over Skype and other webcam-enabled methods of communication. I've only played one game this way so far, and wasn't sure what to expect exactly (I was imagining lots of connectivity and audio issues, as that's how almost every Google Hangouts session I've ever been in has gone), but it went better then I expected and I'm looking forward to playing more this way! It's nice that in 2018 there exists another way to connect all these people who enjoy this particular format of the game.

Recently, I figured that since my introduction to the game was largely via the Fourth Edition set originally released 23 years ago in 1995, I would treat myself to some sealed old stock of this set. Buying any sealed old stock of Magic and opening it is guaranteed to lose you money, so one cannot treat it as an investment, but rather as an indulgence into nostalgia.

I suppose it's important to point out that, with my goal of building my "old school" decks within the boundaries of the stricter Swedish rules, using Fourth Edition cards in my decks is not possible. Unless I decide to follow the more lenient Eternal Central rules, which I may do at some point given the way that prices are going up recently... At any rate, I bought this just for fun, nothing else!

One of the two player "gift boxes" for Fourth Edition. Containing two 60 card decks, a rule book (a slightly bigger one then you'd get ordinarily with a starter deck box), and a (in my opinion) kind of nice flannel bag with glass counters. There was also a mostly equivalent gift box set for Revised Edition, but it's far more expensive due to the possibility of it having dual lands (which are expensive cards).

The bag holding the glass counters had broken open on its own somehow over the past 23 years and they were scattered in the box when I opened it, but no harm was done. It's kind of funny seeing the mail-in response cards. A part of me wants to try sending it in, but I think I'll just leave it here in the box for completeness. The little black flannel bag to hold the counters is a bit nicer than I expected. Not super great quality or anything, but after seeing this now, I think I'm definitely going to use it with these glass counters for all my games going forward. As I understand it, the counters were intended to be used to track the players' life totals during a game. However, nowadays there are mobile apps that do this task just as well (if not better). Instead, using these to track tokens and counters as needed for cards during a game seems a much better use to me.

But let's get on to the sealed decks. As per the description on the back of the box shown above, "this box contains everything two people need to play", so what will these decks actually look like?

Heh. So, if you've ever opened a Magic starter deck box (60 cards), then you'll instantly know that despite the fact that these two "decks" were packaged differently in a "two-player" set ... this gift box just contains two normal starter decks. That is, 60 randomly assorted cards with lands, and the usual number of rares, uncommons and commons. They're not specially prepared into anything even remotely resembling what one could consider to be a playable deck.

Well, actually now let me think about that, heh. If I think back to how I used to play this game with my brother back in the mid-90's (described above in this post), then for me these two "decks" here are actually quite reminiscent of how we played! Lots of random cards with no theme or strategy. Just a "play with what you have" kind of feel to it. Now, if you were a player who had money, was not a young kid just getting into the game, and had the knowledge to construct an optimized deck, then no, these two decks most certainly would not be "playable" to you.

Aside from all of that, it brought a big smile to my face shuffling through these cards. It's always fun to see a Craw Wurm. Who doesn't like big creatures? And 6/4 was pretty damn big! With the way my brother and I played back in the day, Howl from Beyond was a very strong card given that we typically left our attacks until the end of the game (at which point you had a lot of lands out and could hugely buff one of your attackers... and since we only had one Howl from Beyond back then, it often was a game winner). I remember we both never really thought much of Erg Raiders ("why would you want to play a card that hurts you every turn you don't attack with it!?"). Always nice to see a Lightning Bolt of course. Terror was a card that we also had only one or two of back then, and it was something we loved drawing and using to instantly kill something of the other's. Cards like Holy Armor and Firebreathing were also quite sought-after for us during games, as anything that buffed your creatures was good for our typical end-of-game battle-royale-style attacks. For me now in 2018, seeing Greed, Hypnotic Specter, Fellwar Stone, Strip Mine, Millstone, and Power Surge amongst others is also quite nice.

Fourth Edition, and to a slightly lesser extent, Ice Age will always remain my favourite sets of Magic. Not just because it's what I started playing with, but also because I always thought that the cards looked their best (highly subjective opinion of course) at this early point in the game's life while still retaining the majority of the original cards and artwork (in the case of Fourth Edition anyway). Chronicles also continues that, though with a smaller pool of cards. Revised edition, as I mentioned earlier, looked rather ugly to me. Alpha, Beta and Unlimited edition cards look rather nice (I prefer Beta to the more rounded corners of Alpha), but to me there was always a certain ... crudeness (?)... to them. I'm not sure if that's really the right word honestly. Perhaps it is, but I feel someone will read that and get an exaggerated impression of what I mean. They were of course the earliest editions (as evidenced by the names "alpha" and "beta", heh), but they did always have a tiny bit of an unpolished feel to me. However, that feeling could largely be because I didn't see these editions until after I was already accustomed to Fourth Edition, Chronicles, and beyond. I can see how these simpler looking cards from the earliest editions would be more appealing to some. And to be clear, I am not saying I dislike them by any means. Quite the contrary, as I treasure my existing collection of Alpha/Beta/Unlimited cards!

(Just ignore the blatantly obvious centering issue on the Fourth Edition card on the bottom right, heh. That kind of thing happens in any edition.)

I suppose this all just helps to demonstrate how our early encounters with things colour our perceptions of them later on.

Updated libDGL Code

A quick post just to point out that I updated the libDGL GitHub repository with the most up-to-date working code that I currently have.

Since I originally pushed libDGL code to GitHub last November, not much in the way of new functionality/features has been added. Kind of disappointing for me to think about actually, heh. That being said, over all that time, I do feel like I fixed up a bunch of bugs and generally improved the performance of what was there. However, looking at what is left on my to-do list for libDGL, I still really have my work cut out for me:

  • Scaled/rotated blitting support
  • Blending
  • "Mode 7" like support
  • Custom font loading (BIOS-like format?)
  • Joystick / Gravis GamePad support
  • Input device (keyboard/mouse/joystick) events
  • PC speaker sounds
  • Sound Blaster compatible sound/music
  • Gravis Ultrasound compatible sound/music
  • Sine/cosine lookup table optimizations
  • BMP, LBM, GIF image loading (and saving?)
  • Simple immediate mode GUI

This list is definitely not in any particular order. I want to start building a simple 2D map editor tool (since the old QBasic one I have sitting here is missing its source code, I cannot even just extend it as a quick alternative), so the last item about a "simple immediate mode GUI" is probably going to be my next task.

Following that, I kind of want to do something with audio. I've been focusing a lot on graphics lately and feel like a change would be nice. More specifically, I think starting with some MIDI playback might be fun. I just recently picked up a Roland Sound Canvas SC-88VL (through which MIDI songs sound absolutely exquisite) and this is most probably influencing that decision, heh. However, I think I'd likely want to start with writing MIDI playback code for a Yamaha OPL as that was far more commonplace, but supporting General MIDI devices also sounds like a nice second step.

Fixing Up an IBM Model M2 Keyboard

A few weeks ago I picked up a Model M2 keyboard from eBay. I'm not a raving fan of mechanical keyboards but I definitely agree that they are quite nice to type on. I actually got a Das Keyboard Model S Pro years ago, but I haven't used it recently since it has a Windows keyboard layout, my modern computer is a Mac, and I've just grown to hate using Windows keyboard layouts on a Mac. But otherwise, it's quite nice and I wouldn't hesitate to recommend a Das Keyboard to anyone. Unicomp also makes apparently very nice mechanical keyboards in the style of the original IBM keyboards, but I've not tried these personally.

At any rate, I picked up this Model M2 to use with my "retro" computers. Not really for any particular reason other than that it feels era-appropriate and is a nice, relatively compact buckling spring keyboard. Many more people would prefer the original Model M over this, but I've always been put off getting one of those due to their bulky size.

The Model M2 is infamous for having bad capacitors. The two capacitors on the controller inside will apparently go bad (dry out) quicker if unused for long periods of time, so even a brand-new-in-box M2 keyboard isn't guaranteed to work.

And as if bad capacitors weren't bad enough, the M2 is also infamous for being difficult to take apart and to reassemble (probably harder to reassemble than to disassemble, I think). Apart from two screws on the bottom, the majority of the keyboard is held together by somewhat easy-to-break plastic clips internally that need to be very carefully opened. Oh boy.

It should be noted that not all Model M2 keyboards are mechanical. Certain ones made by Lexmark with a model number beginning with '7' are rubber dome. But otherwise from the outside they look identical.

The one I got arrived and initially didn't work when I plugged it in to give it a try. It gave the tell-tale sign of bad capacitors where only two LED lights flashed on and stayed on and no key-presses were ever registered. However, I noticed that after unplugging it and plugging it back in, it worked perfectly. I continued using it for a couple weeks like this and all was good, but I knew that this wasn't a long-term solution and that I really did need to go and replace the capacitors.

Onwards to disassembling!

First things first. Take a picture of the keyboard before you take anything apart. This is so you can use it as a reference for where the keys all go when you're putting it back together.

To begin, the keycaps all need to be removed. This is because the aforementioned plastic clips that hold the keyboard together are underneath the keycaps. And you'll probably want to clean the keycaps anyway. Mine actually weren't that dirty as you can see from the above photo, but I still cleaned them anyway.

Removing the keycaps is really simple. You can use any thin flat tool to pop them off. I began by popping off all of the square keys. A number of the longer keys such as Backspace, Enter and the space bar have additional little brackets that attach to the bottom that need some extra care, so it's best to save these for last.

With keys like Enter shown above, I found it easiest to pop up the keycap first in the same exact way as I'd done for every other square key, but before trying to lift it off completely, take a dull flat/thin tool and press down the bar so you can easily slide it out. It's very easy, but you do need to be careful as the plastic that holds the whole bracket to the keycap is very thin and easy to break!

The space bar is a little bit different than all the other keys. Again, I started by popping it up in the same way as the other keys, but again, before trying to lift it off, you need to release the bracket. This one is different than the other brackets and is held down by two little bars that need to be pushed out. You push the left one out to the left and the right one gets pushed out to the right. Use a dull, flat, thin tool again to push them out. They are a little tough to push out, but once you get the first one the second one is easy. Again, be very careful as the plastic is thin and easy to break!

Now the keycaps are all removed.

At this point, you'll want to take another picture. This is important because as you can see, not all of the holes have springs in them! All of the empty holes are the extra positions covered by the longer keys, which only need one spring each, just like the smaller square keys.

If you're cleaning the keycaps, get some soapy water ready and let them soak for a good hour before doing any scrubbing. That'll give you plenty of time to do the rest of the disassembly and maybe even get the capacitors replaced too depending on how things go.

As you'll be able to see in the above picture of the keyboard without the keycaps on, there are 13 small plastic clips that need to be separated to remove the top plastic half of the keyboard. You again use your dull, flat and thin tool to separate the two plastic parts of each clip, but I found that they would not stay separated, and trying to push down the other half of the clip so it would not reattach was tricky and not guaranteed to work. The whole plastic of the keyboard is somewhat flexible, so even if I got one clip to stay separated, once I picked up the keyboard there was a very good chance the whole thing would flex a tiny bit and the clip would somehow find its way back and snap together again. Super frustrating!

So, I figured I needed something to wedge many of the clips apart while I pried off the top plastic cover on the keyboard. Not having much in my apartment to use for wedging, I turned to my little stack of spare computer expansion slot/bay covers. This actually worked much better than expected and I was able to turn the keyboard on its side to begin prying it apart with the confidence that the clips would stay apart.

If you decide to go this route, whatever you use as a wedge should really be thin and hard (so as not to bend/flex while wedged in the clip). A very common complaint about these keyboards is the ease with which these plastic clips break, so you really don't want to flex them more than you have to!

As you can see in the photo on the right, you need to pry apart the top and bottom halves of the keyboard from the side. I started from the bottom and once I got it apart enough, used my fingernails to keep it apart while I worked on the top half. Eventually I got it open enough to fit in the tool I was using. By this point, the expansion slot cover wedges that were closest to where I was opening the keyboard from were falling out, as expected, since there was nothing to hold them there once I started opening it.

I only had enough wedges for half the keyboard, so once I had it open enough that the first 5 of the wedges had fallen out, I used my tool as a bigger wedge and left it placed between the two halves of the keyboard, set the whole keyboard back down and re-used those expansion slot covers as wedges, placing them into the remaining plastic clips on the opposite side of the keyboard. At this point, all of the plastic clips were either already apart or had a wedge in them and I was simply able to somewhat gently but firmly pull apart the top and bottom plastic halves of the keyboard as if I was slowly and carefully opening a book.

Luckily for me, I did not end up breaking a single plastic clip in the process! Hooray! Even if you broke a couple clips, it's not the end of the world. Hopefully though you don't break too many. If that does happen I imagine you could probably use a bit of hot glue to put them back in place.

I should point out that all the while you will probably notice and hear/see the buckling springs inside falling out of their place. Don't worry about this, but definitely do not try to close the keyboard again at this point else you'll probably squish and ruin some of these springs after they've been freely moving about inside. Just keep going with opening the keyboard and it'll all be fine.

Take all of the springs and carefully place them someplace safe for now, out of the way. We won't need them until we begin reassembly. You should also carefully peel up the thin black sheet/mat that the springs were sitting on. You can clean this if you like. I just brushed it off lightly with a dry cloth and then set it aside.

Some people report finding that the traces on the membranes corrode and go black or dark brown or whatever. As you can see, I didn't have that problem. Not sure what people do to fix that problem so unfortunately I cannot advise there. I would not recommend removing any of these membranes if you don't see any problems on any of the traces. It looks like it would be tricky to get them all back perfectly aligned and I've read comments from people saying as much. Just leave them as they are if you can.

And now the problem capacitors. You can kind of see in this photo that some of the contact pads underneath the capacitors have gone all dark brown, due to the capacitors starting to leak out. After seeing this, I was kind of surprised that the keyboard had worked for the couple weeks I'd been using it so far. I scraped off as much of the dark brown gunk from the solder as I could using my pocket knife, being very, very careful to not scrape any of the surface of the PCB surrounding it. Once I'd got enough of it off that I could see mostly solder, I got out my soldering iron.

You could of course try removing the controller PCB from its position in the keyboard. As you can see there are several plastic clips holding it in place. It seemed to me that it would be quite tricky to remove, so I decided that I didn't want to risk breaking these clips. These capacitors are surface-mount, not through-hole, so technically there is no actual need to remove the PCB anyway in order to remove them. Plus it's only two capacitors, and there is enough of the contact pads visible on each of them that I figured I wouldn't need to apply much heat anyway, so there wasn't likely to be any harm caused by doing the whole re-capping with the controller left where it was.

Post-removal, I was left with more of a mess to clean up from all the leaking that had gone on over the years. Unfortunately I also pulled up a bit of the bottom contact pad while removing the smaller capacitor on the left. Whoops. With that in mind, I'm not sure I'm the best person to explain the process of using your soldering iron to remove these capacitors. But if you do still want to know what I did: basically, I just heated up the exposed area of one of the contact pads, gripped the capacitor with a pair of pliers, and twisted the side being heated away once I could see the solder had melted. Then I repeated the same process for the other side. I think my problem was that I tried twisting the capacitor too soon when the solder wasn't melted yet, so twisting the capacitor away just ended up ripping up the contact pad in the process.

I used 99% alcohol and Q-tips to clean up the remaining gunk from the leaking old capacitors. I initially used my pocket knife to scrape up some hard bits without really thinking about it and ended up scratching a bit of the PCB. Dumb, dumb, dumb! Thankfully I didn't end up cutting a trace or anything. After this close-call I decided to just use my fingernail to scrape off the remaining bits of hardened gunk.

The replacement capacitors needed are a 2.2µF 50V and a 47µF 16V. You can of course go higher with the voltage, but should definitely keep the capacitance the same in any replacements you decide to use. Specifically, I used these two capacitors that I got from DigiKey.

I placed little squares of electrical tape down as a precaution. I wasn't sure if the capacitors would get pushed down and by how much (potentially putting the side of them in contact with the PCB) once the top of the keyboard was placed back on. Maybe it wasn't needed. Also, certainly anyone could do a better job of soldering than I did here, heh.

Before reassembling, it's important to test this out and see if the new capacitors are doing the trick. I took the keyboard over to my computer, plugged it in, powered it on and voila, no more stuck LEDs after a cold boot (and no need for my previous unplug-and-replug trick)!

You can simply tap your finger on the membrane to test keys. I tested a bunch this way to make sure that everything was working fine.

At this point, it had been over an hour because I am kind of slow with these things. So it was about time to clean up all the keycaps and the top plastic cover of the keyboard. Once that was done, I set them all out, face up, on a towel and let them dry for a few hours. I actually used my DataVac Electric Blower Duster to dry off the top cover quicker as I wanted to get on with the reassembly sooner. But I did leave the keycaps to dry on their own for a few hours in the meantime.

To replace the buckling springs, you need to take the top plastic half of the keyboard and set it upside down, but with something to mount it up a bit higher. This is because when the springs are placed inside, the top of each spring will dangle slightly past the top of the plastic cover. So if you had it just resting flat on some surface you would not be able to correctly and fully insert each spring into its little bracket. As you can see here, I'm using two hard disks on either side (because they were the only really suitable thing within reach as I sat down to do this, heh) to mount it a bit higher.

Then, using your previously taken picture of the top of the keyboard before you took it all apart, re-insert each spring into its bracket, leaving the correct few spaces empty. It is absolutely important that each spring fits snugly into its bracket on the keyboard cover. However, there's nothing to hold them in place other than gravity, so just be careful. When you're done with this process, do a quick once-over to ensure that they are all snugly in place. Trust me on this!

Now take the thin black sheet/mat that we removed and set aside before. Place it over top of the springs. Each of the holes in the sheet should line up with the various holes for the clips and two screws on the plastic cover. Unfortunately there is nothing to hold this in place.

And now is probably the worst part. We need to take the bottom half of the keyboard (that has the membranes and controller PCB in it) and place it on top of the top half of the keyboard with the springs in it. AND we need to do it while it's mounted slightly off the ground as we have had it thus far. This is incredibly important as otherwise the springs will pop out of place. In my case, holding the bottom half of the keyboard upside down did not result in the membranes falling out, but I would guess if that happens to you that you could use some small bits of tape to hold them in place. In my case, the black sheet would not stay in the bottom half of the keyboard while held upside-down, so pre-placing it on the top half as shown in the above picture worked best for me.

Carefully hold the bottom half of the keyboard over top of the top half, lining it up while being careful not to accidentally shift the top half off of its two supporting mounts, and then set it down, pushing it together. DO NOT pick up the whole thing to attach the two plastic halves together. You definitely want to leave the top half with the springs in it resting on your two mounts throughout the entire process. Go around all the edges and use your hands to squeeze all the edges together and you should hear all the plastic clips snap into place. If you pick it up (even slightly) to do this, you risk the springs falling out of place!

As it turns out, this exact problem happened to me with exactly one spring. Once I had reattached all the keycaps, I was testing all the keys and noticed that the 'W' key didn't work unless pressed "just so." Taking off the keycaps again, I took a flashlight and looked down at the feet of the buckling springs.

It's maybe a bit hard to see in this picture, but the black feet of the spring for the 'W' key are very slightly crooked. The foot on the left side had somehow shifted out of place during reassembly and was outside of the plastic bracket that it should be sitting in. This was resulting in the key not pressing correctly (even though the sound of it pressing was just the same as every other key that worked fine). The fix for this was to take it all apart again and reassemble it. Not fun. So, be very, very careful when reattaching the bottom half of the keyboard to the top half with the springs in it! Take your time with it.

Once you've got that done, reattaching the keycaps is easy. Start with the spacebar and then do all the longer keys. Leave the simple square keys to the end as they are the most straightforward.

With every keycap, the goal is to have the top of the spring resting in the middle of the underside of the keycap. As you can see in the photo on the left, there is a small, round, slightly raised piece of plastic inside the bottom of the groove in the middle of the keycap. The top of the spring, when inserted correctly, will rest perfectly around that small round piece of plastic.

What is somewhat likely to happen when you're replacing the keycaps is that the spring gets caught on the open flat area at the top of the groove, or it ends up resting somewhere on the little plastic ramp thingy on the other side. If this happens you need to pop off the keycap and try it again. You'll know you got it correct when you're able to press the key down and it makes the very same clicky sound as it did before you took it off in the first place. If it feels too mushy and, most importantly, does not make that clicky sound, then the spring is not in the correct position. If you're not sure if it's making the correct clicky sound, assume that it's not correct and try again. If you're still not sure, try replacing a few other keycaps and compare the sounds.

Most of the longer keys have more than one groove. The groove that the spring goes in is always the one that has the top/bottom of the plastic cut away, as you can see in the photo on the right. I had a lot of trouble getting the number pad '+' and Enter keys on correctly. The springs just kept not sitting right when I popped the keycap back on. What ended up working for me was to tip the keyboard up, so it was resting on the top edge (IBM logo down), forcing the spring to sit naturally a little lower (due to gravity) as I was inserting the keycap.

As you're replacing the longer keys with the bar/bracket thingy, use a tool to push the bar down slightly (and very carefully, you don't want to push it too much and break it!) so it fits under the clamps.

Finally, remember to replace the two screws on the bottom.

Heh, you probably can't even tell at a casual glance that this is a different photo than the first one I posted, because the keyboard was relatively clean to begin with. This photo is definitely post-cleaning-and-fixing!

And that's pretty much it! I hope this helps someone out there. There are a number of guides to repairing and disassembling/reassembling the Model M2 keyboard that other people have written over the years, but I always felt like there were some details missing, particularly with regard to disassembly. I wrote this post thinking about what details I would have loved to have going into this. It ended up being quite wordy, but well, sometimes (often) more details are better!

Using Watcom's Register-based Calling Convention With TASM

I suppose I'm writing this post for my own benefit primarily. I'll likely forget many of these details in a month, and then go and try to write a bunch more assembly and run into problems. So I'll try to proactively solve that future problem for myself. Everything here is better documented in the compiler documentation. However, it is scattered around a bit and of course isn't written with specific examples for using TASM.

One of the performance benefits that Watcom brought with it, and that was a pretty big deal at the time, was that its default calling convention used registers for up to the first 4 arguments to called functions. Past that, the stack would be used as per standard C calling conventions.

As mentioned, this calling convention is the default, but it can be globally changed via the CPU instruction code generation compiler switch. For example, /3 and /3r both select 386 instructions with the register-based calling convention, while /3s selects 386 instructions with the stack-based calling convention.
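In practice that just means changing the switch passed to the compiler. For example, assuming you're invoking wcc386 directly (the source file name here is just a placeholder):

wcc386 /3r gfx.c    (386 instructions, register-based calling convention -- same as the default)
wcc386 /3s gfx.c    (386 instructions, stack-based calling convention)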

Borland Turbo Assembler (TASM) does not natively support this register-based calling convention among its varied support for programming-language-specific calling conventions. However, it does let you use its "NOLANGUAGE" option (which is the default if no language is specified) and then you can handle all the details yourself.

ideal

p386  
model flat  
codeseg

locals

public add_numbers_

; int add_numbers(int a, int b)
; inputs:
;   eax = a
;   edx = b
; return:
;   eax
proc add_numbers_ near  
    push ebp
    mov ebp, esp

    add eax, edx

    pop ebp
    ret
    endp

end  

This is pretty normal-looking TASM, complete with normal-looking assembly prologue and epilogue code. Note that we are intentionally not specifying a language modifier.

So, first off, add_numbers_ has a trailing underscore to match what Watcom expects by default. If you don't like this for whatever reason, you can change the name here to your liking, but the use of a #pragma in your C code is necessary to inform Watcom about the different naming convention for this function.
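If I'm remembering the syntax right, that #pragma would look roughly like the following, where the quoted "*" means "use the source name exactly as-is" (i.e. no trailing underscore gets appended). This is just a sketch, not something pulled from my actual code:

// hypothetical: tell Watcom the object file symbol is plain "add_numbers", no underscore
#pragma aux add_numbers "*";
int add_numbers(int a, int b);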

Second, via the magic of the register-based calling convention, Watcom will have our two number arguments all ready for us in eax and edx. Our return value is assumed to be in eax, and that is correct in our case so we're all good.

The great thing is, we don't actually need to do anything fancy to call this function from our C code.

// prototype
int add_numbers(int a, int b);

// usage
int result;  
result = add_numbers(10, 20);  

But that was the simple case.

This register-based calling convention actually places the burden on the called function to clean things up before returning. This includes preserving some register values as well. According to the documentation: "All used 80x86 registers must be saved on entry and restored on exit except those used to pass arguments and return values." So, in our add_numbers_ function if we had wanted to use ecx, we would need to push and pop it during the prologue and epilogue code. But we didn't need to do so for eax and edx because those were used to pass arguments and return a value.

As mentioned previously, the stack gets used for arguments once all the registers have been used for arguments (by default, eax, edx, ebx, ecx in that order). In this case, the called function is responsible for popping them off the stack when it returns. So, if there were two int arguments that were passed on the stack, we would need to do a ret 8 to return.

; For this function, using the default register calling convention, the first 4 arguments
; will be passed in registers eax, edx, ebx and ecx. The last two will be passed on the stack.

; void direct_blit_4(int width4,
;                    int lines,
;                    byte *dest,
;                    byte *src,
;                    int dest_y_inc,
;                    int src_y_inc);
proc direct_blit_4_ near  
arg @@dest_y_inc:dword, @@src_y_inc:dword  
    push ebp
    mov ebp, esp  ; don't try to be clever and move this elsewhere!
    push edi      ; likewise, don't try to group the push's all together!
    push esi

    ; code here (that also modifies edi and esi, thus the additional pushs/pops)

    pop esi
    pop edi
    pop ebp
    ret 8
    endp

Is this all too cumbersome to worry about? Well, I don't really think it's a big deal, but there is a way we can remove ourselves from this burden.

Let's say we didn't want to have to worry about preserving any of eax, ebx, ecx, edx, edi, or esi regardless of how many arguments our function has and what (if any) return value it uses. Also, maybe we don't want to have to worry about popping arguments off the stack ourselves when our assembly functions return.

// define our "asmcall" calling convention
#pragma aux asmcall parm caller \
                    modify [eax ebx ecx edx edi esi];

#pragma aux (asmcall) add_numbers;
int add_numbers(int a, int b);       // no change to the function prototype is necessary  

What if we actually wanted to use the normal C stack-based calling convention for our assembly functions and ignore this register argument nonsense? Maybe you're using an existing library and it was written for other compilers that don't use this register-based calling convention.

#pragma aux asmstackcall parm caller [] \
                         modify [eax ebx ecx edx edi esi];

Watcom also pre-defines the cdecl symbol for this same purpose, which you can and probably should use instead of defining your own.
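Applying it looks just like the asmcall example from earlier. One thing to keep in mind (if I recall correctly) is that cdecl also switches the naming convention to a leading underscore, so the matching TASM proc would need to be named accordingly (my_asm_function here is just a made-up name for illustration):

// use the pre-defined cdecl convention for this one function
#pragma aux (cdecl) my_asm_function;
void my_asm_function(int a, int b);   // the assembly proc would then be named _my_asm_function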

In the asmstackcall pragma above, the empty brackets [] denote an empty register set to be used for parameter passing. That is, we are saying not to use any registers, so the stack is used instead for all of them. With that in mind, we could also expand the set of default registers used for parameter passing:

#pragma aux asmcallmorereg parm caller [eax edx ebx ecx edi esi] \
                           modify [eax ebx ecx edx edi esi];

In this case the modify list is redundant and need not be specified.

Of course, saying that your function will use/modify more registers means that the compiler has to work around it before and after calls to your assembly function, which may result in less optimal code being generated. There's always a trade-off!

None of the above #pragmas remove the need for the standard prologue and epilogue code that you've seen a thousand times before:

push ebp  
mov ebp, esp  
; ...
pop ebp  

The only exception is if your assembly function isn't using the stack at all.

There are many details I've left out. For example, passing double values will mean two registers will get used for one argument because doubles are 8 bytes. But if you only have one register left (maybe you passed 3 ints first), then the double value will get passed on the stack instead. Additionally there are more details to know when passing/returning structs. But I'm not doing any of this right now, so I've not really looked into it beyond a passing glance.