Being a long-time demoscene enthusiast, I try to keep an eye on new productions that land at the various competitions held throughout the year. Back in 2011 I saw the release of a rather unusual scenedemo — a demo written for the Apple Lisa. Called Le Requiem de Lisa, by CMUCC (the Carnegie Mellon Computer Club), it was, at the time, the only demo ever written for Apple’s long-abandoned, first MC68000-based platform. Le Requiem de Lisa won 1st place at the Pixel Jam 2011 demo competition (category: “oldskool demo”), and through an odd coincidence I found myself interviewing two of the group members behind the demo shortly after its release.
After seeing the demo for the first time I headed into my regular IRC hangout, #macintosh on DALnet, and asked the denizens if they’d seen this one yet. One of the regulars, Lincoln Roop (handle “ClarusWorks”) whom I had known online for several years, began giving me a surprising amount of detail about the demo. As it turned out, he was a hardware guy on the project, a member of CMUCC.
Lincoln was up for a few questions about his group’s Lisa demo to be shared here on Byte Cellar, so I fired away. A tidied-up version of our IRC interview follows. I regret having sat on this dialog for just over four years now, but better late than never.
Me: What was the initial state of the Lisa you set up for this?
Lincoln Roop: A stock Lisa 2 — well, mostly stock. It has an IDEfile hard disk that I built, but the stock 1MB RAM, etc.
Me: Now, at the end of the demo there’s a six-color Apple logo onscreen… the Lisa 2 is a monochrome machine. What’s the story there?
LR: So, another club side project a couple of guys worked on is getting digital video out of retro computers. They did some stuff on an FPGA and some custom PCBs. We added some shims to steal the digital video signals and contrast register settings off the Lisa’s I/O board. The Lisa has a software-adjustable brightness register, so we send the digital video (1-bit monochrome) along with that brightness register to our FPGA. Most settings just look white, and you can’t do much with this because of how slowly that register changes, but we could do it well enough for the color Apple logo at the end. When we presented the demo at PixelJam, we actually had the Lisa outputting HDMI to a projector; basically, we interpret the brightness register as a color setting.
Me: So you were involved more on the hardware side than with the coding.
LR: Yeah, I was the one who restored the club’s Lisa to operation, tracked down the Workshop install images, etc., and did the Lisa hardware mods to get those video signals. The screenshots at the beginning of the demo were also done by me. The rest of the artwork and code was done by the other guys; I’m not our group’s best programmer.
Me: So what dev system did the coders use, Apple’s MPW?
LR: The ‘dev system’ was quite hackish. We used the LisaEm emulator for some of the testing, and did a lot of the build work on Linux using GCC and some custom tools one of the guys wrote. He didn’t want to bang his head against the Lisa dev tools, so he learned enough to figure out the executable format.
Me: What was this written in, 68000 assembly?
LR: Yes, mostly, but some things were done in C.
Me: How did the coding go? Was it hard to get the desired performance out of it?
LR: Yeah… I didn’t write code for this, but the guys who did had a lot of challenges. For instance, the incredibly slow plasma effect? It was supposed to look better than that.
After my Q&A with Lincoln, he put me in touch with another CMUCC member, Andrew Wesie, who was the lead programmer on the demo. Andrew was able to detail the overall process and some of the particular challenges of wrangling scenedemo performance out of the first consumer-oriented personal computer with a GUI and a high-resolution, bitmapped display.
Me: Andrew, Lincoln indicated you could provide some insight regarding the coding of the demo. What can you tell me about your role and the dev process?
Andrew Wesie: To give a little background on my involvement: I was responsible for the overall programming, combining some of the effects written by others, and the audio. I also used the tools available on the Lisa to reverse engineer the program format enough to do all the coding on Linux using gcc.
Me: Was coding the Lisa a bit of an undiscovered country? Similar to 68K Mac dev?
AW: I had never coded on 68K before; the Lisa was my first adventure in that space. As someone in the cybersecurity field, I had looked at x86 and MIPS assembly, so learning 68K was pretty easy. I had also written bare-metal software before, so living without an OS [was] natural. The hardest parts are figuring out how to interact with the hardware and making sure the OS doesn’t get in the way.
Me: How did you know how to code it — Apple docs?
AW: To interact with the hardware, the Lisa manuals were wonderful. They told you how to talk to the video circuitry, where the timers were in memory, which chips were used (so you can look up datasheets), etc. There was also commented assembly for some parts of the Lisa, which proved useful.
Me: Is the Lisa a bit of a mess for an asm target, as I expect?
AW: The OS was a bit clunky to work with. I wrote a simple Pascal program to convert hex to binary so that we could transfer programs over serial to the Lisa. Loading a program larger than one page of memory took a while to figure out, since the OS only loads segments of a program as they are needed, so the trick was to put a piece of code that we could jump to in every segment. To keep the OS from messing things up, and to get past its memory protections, I used a timer syscall to call a function that would turn off interrupts and run the demo.
Another thing to mention about coding is that the Lisa Emulator was quite valuable to reduce the time it took to test new code. At some point though, I think it was when we did audio, the emulator stopped liking the demo as much.
Me: Is the bus insanely slow, or does it match the 5MHz CPU pretty well?
AW: The CPU speed is only nominally 5 MHz, and the bus is insanely slow. If you look at the Apple docs, I believe they say the CPU can execute at most one instruction every 4 cycles, so you get at best 1.25 MIPS. Some instructions take more than 4 cycles, so on average you get less than that. Since the video memory (framebuffer) is just regular memory, speeds there aren’t too bad, but the high resolution makes it difficult to do full-screen effects (since the CPU manages only 1.25 MIPS). Interacting with the timers to get sound is fairly slow, not to mention very limited.
Me: How long did you spend on the code overall?
AW: I think we got everything done in about 2 months, and coding was spread out over that time. I remember spending most of PixelJam finishing the coding.
As I mentioned, these interviews took place four years ago. In that time, CMUCC has gone on to release a second scenedemo for the Apple Lisa, Introducing the iLisa, which won 1st place at the lord of all demo competitions, Assembly, in 2012 in Finland (category: “wild demo”). The group also released one other production around this time, Where Have All The Pixels Gone? for the Vectrex console, which won 1st place at Pixel Jam 2012 (category: “combined demo”).