My Windows display driver is starting to work a bit better now. And now I know that it's the keyboard driver that stops working without a graphics mode (???)
@leean00 no, this one was Debugme.bat
This is the contents of a micro Windows 2.03 install with debug kernel, as displayed by the proto-Progman. I guess I should have shared the comparison screenshot, but I have bigger things to fix at the moment
@leean00 here's how it looks with a stripped-down EGAmono driver
@nina_kali_nina I have nothing meaningful to contribute, but I have to say I really enjoy following along while you're for some reason implementing a Windows 2.0 graphics driver in 2024
@zerodogg Thanks. I think I owe you a backstory! A while ago my fiancee and I randomly got a British computer called "Apricot PC".
It is a computer built to the Intel specification of what 8086-based computers should be. Because of that, it is incompatible with the IBM PC in major ways. It even has three processors: 8086, 8087, and 8089!
Being an early 8086 machine, it runs an OEM version of MS-DOS. Most of the MS-DOS software doesn't work: the best editor you can find for it is Turbo Pascal 2.0.
A few years ago, by some miracle, someone found a Windows 1.03 port for this computer. It works quite well, but Windows 1.x did not have Word and Excel, so it is mostly useful for Minesweeper. Windows 2.x was released for some OEMs, but not this one. I want to fix this injustice.
One curious thing about Apricot PC and its incompatibility with IBM PC is how graphics was implemented. You can see that it has a stunning 800x400 nine inch display (100 DPI, vs 33 DPI for IBM CGA).
It uses a Motorola 6845 as its main display chip (just like IBM's CGA), but the chip is wired in the most boring way possible: it can only work in "text" mode, just like Motorola planned. There are no clever hacks that create a linear framebuffer like on CGA or a multi-plane framebuffer like on EGA. Instead, the "graphics" mode is just the CRTC configured for a 50x25 text mode with a custom 16x16 "font" backed by a 2048-character "font table".
This means your line drawing algorithms and your bit-blitting functions must do some super complicated arithmetic to convert (X,Y) pixel coordinates into the (X1,Y1) 16x16 segment of the screen, then find the (word-wide) row X2 within it, and then find the column bit Y2.
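Roughly, setting a single pixel ends up looking something like this. A minimal C sketch of the arithmetic only: the 1:1 cell-to-glyph mapping, the glyph layout in RAM, and the MSB-on-the-left bit order are my assumptions, not something from the Apricot documentation:

```c
/* Sketch of the pixel-address arithmetic; the memory layout here is
 * assumed, not taken from the real hardware or driver. */
#include <stdint.h>

#define TILE 16            /* each "character" cell is 16x16 pixels */
#define COLS 50            /* 800 / 16 cells per row                */
#define ROWS 25            /* 400 / 16 cell rows                    */

/* Turn on pixel (x, y), assuming every text cell N shows glyph N
 * and each glyph is 16 consecutive words in font_ram. */
static void set_pixel(uint16_t *font_ram, int x, int y)
{
    int cell_col  = x / TILE;            /* (X1, Y1): which 16x16 cell     */
    int cell_row  = y / TILE;
    int cell      = cell_row * COLS + cell_col;

    int glyph_row = y % TILE;            /* X2: word-wide row in the glyph */
    int bit       = 15 - (x % TILE);     /* Y2: column bit (MSB = leftmost, assumed) */

    font_ram[cell * TILE + glyph_row] |= (uint16_t)1 << bit;
}
```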
But when it works, it's stunning, for a computer from 1983.
@nina_kali_nina @zerodogg IIRC the Apricot PC was a particularly elegant computer, with a small LCD display on the keyboard. I wonder if my adventure games would work on it, as they only use DOS and the text mode.
@davbucci @zerodogg The keyboard has a controller and a built-in LCD screen running its own thing - a calculator app, a clock, etc; five of the F-keys are re-programmable - Windows uses those for "Close window", "Move window" etc.
As for your games, I grabbed a copy of Silk Dust
* Silk.exe, as expected, doesn't work, because graphics
* Silkn.exe works, even without an IBM PC BIOS emulator.
@nina_kali_nina Haha, I love it, what a weird piece of hardware. It's important to do something with injustices when you can!
@zerodogg a curious thing is that back in the day it was the "IBM PC" that was weird, mixing and matching things that were not meant to be used with 8086 :D
But then, the Apricot has a 3.5" floppy drive, which was definitely not a common thing at the time (the Macintosh wasn't released just yet).
@nina_kali_nina @zerodogg I tried to look it up but don't entirely understand what the 8089 did exactly. Some sort of acceleration of I/O handling? Is it sort of like a FSB or something?
@nazokiyoubinbou It was similar to "Channel I/O" in mainframes, but really it was a bona fide processor with its own assembly language. The idea was that a bunch of CPU cycles were wasted on copying memory between devices; you could offload this to the 8089, leaving all the true computing to the 8086. A multi-processor system by design! The 8089 could also be configured to run on top of a separate bus, with some extra logic to allow communication between the system bus and the external bus.
Intel's own manual shows that it can be used to move data from a shadow frame buffer to a video RAM at 1.25 megabytes/sec (that'd be 20 FPS for this computer) all while leaving the main CPU available for other calculations.
Now that I think about it, I probably can/should leverage the 8089 to do the conversion between the linear framebuffer Windows expects and the 16x16 blocky mess of graphics the computer has.
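On the CPU side the conversion itself is just reshuffling words. A rough C sketch, assuming a 1-bpp word-packed shadow buffer and the same 1:1 cell-to-glyph layout as in the earlier sketch (the buffer layouts are my guesses, and ideally the 8089 would be the one doing this copy):

```c
/* Rough sketch: copy a 1-bpp linear shadow buffer (800x400, packed as
 * 50 words per scanline) into the tiled glyph layout. Buffer formats
 * are assumptions, not the actual Apricot or Windows driver layout. */
#include <stdint.h>

#define TILE 16            /* 16x16 pixels per text cell */
#define COLS 50            /* 800 / 16 cells per row     */
#define ROWS 25            /* 400 / 16 cell rows         */

static void flush_shadow(const uint16_t *shadow,  /* ROWS*TILE scanlines of COLS words   */
                         uint16_t *font_ram)      /* ROWS*COLS glyphs of TILE words each */
{
    for (int y = 0; y < ROWS * TILE; y++) {
        int cell_row  = y / TILE;
        int glyph_row = y % TILE;
        for (int cx = 0; cx < COLS; cx++) {
            /* one 16-pixel word of the scanline becomes one row of one glyph */
            font_ram[(cell_row * COLS + cx) * TILE + glyph_row] = shadow[y * COLS + cx];
        }
    }
}
```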
@nina_kali_nina @zerodogg it also has one of the first 3.5” HDDs, which was also made in Scotland (sic!)
and oh how noisy and loud it is
@nina_kali_nina Would that be sort of akin to a modern uncore?
@nina_kali_nina @zerodogg thank you. Can I add the screenshot to the list of supported platforms?
@davbucci Absolutely! I also suspect it will work just fine on other non-IBM computers, like Victor 9000 and DEC Rainbow as long as they run a DOS version compatible with your game (I have no idea if your game would work on DOS 1.x or 2.x, but if it would, then it would do so on any non-IBM PC)
@yottatsa @nina_kali_nina I mean, obviously this beast deserves Windows 2.0.
It's kinda fascinating that they decided "nah, let's not do IBM PC compatibility", and then on 80's hardware also went "we'll just emulate it".
@zerodogg and the impressive part is that a big chunk of this "emulation" works xD truly incredible
@nina_kali_nina 😆 I do get the distinct impression that a lot of old tech was mostly held together by duct tape and hope.
@zerodogg it's even worse with modern tech. Docker compose, yarn install and pray that you won't get any malware installed
@nina_kali_nina @zerodogg @nazokiyoubinbou it’s used for heavy I/O, DMA, and memory copying, mostly. It doesn’t have an 8237 DMA controller like the IBM PC does.
@yottatsa @nina_kali_nina @zerodogg Yeah, for the time period this sounds like a pretty good idea. Though I imagine the whole setup was costly. Especially throwing in the 8087 like that too.
@nina_kali_nina @nazokiyoubinbou @zerodogg my bad googling says it was USD200 for 8089 and USD150 for 8087 at around that time
also, the computer uses an 8086 instead of the 8088 as in the PC, which was (is) somewhat rarer.
@yottatsa @nina_kali_nina @zerodogg Yeah, I'm wondering here if IBM's decision was basically just a cost saver. The 8086 was more expensive than the 8088 and the 8087 + 8089 adds even more. And that's just the processors alone...
@nazokiyoubinbou @yottatsa @zerodogg I'm pretty sure it was, yes. And then they made lots of money by having IBM branding :D
@nazokiyoubinbou @nina_kali_nina Unfortunately, as implemented in the Apricot, you can’t run them simultaneously. The 8089 has two buses it can work with, but the secondary bus only connects to I/O space. So the program the 8089 runs has to be in main RAM on the same bus as the CPU and they can’t both access it at the same time.
I suppose theoretically you could put some RAM in I/O space to fix that, but it wouldn’t be very portable. (Then again, Nina and Atsuko are the only other people I know who have one, so maybe it’s viable. 😄)
@bytex64 @nazokiyoubinbou and the RAM isn't especially fast there, unfortunately - but I do suspect that even in this configuration it might give a bit of a speed up in certain situations
@nazokiyoubinbou @nina_kali_nina One of the other fun things inside the Apricot is a TI SN76489 Programmable Sound Generator (that's the one from the Master System, Tandy 1000, PCjr, etc.), which criminally is only used to implement the system beep and keyboard click. The 8089 has a mode where it only executes every 128 cycles ("Bus Load Limit"), and in this mode it could easily run the 76489, playing music without any CPU involvement at all.
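For the record, poking a tone into the 76489 only takes a few byte writes. Here's a hedged C sketch where the port address and clock are placeholders (I'm not quoting the Apricot's real mapping); only the latch/data byte encoding is the standard SN76489 format. An 8089 channel program in Bus Load Limit mode could in principle stream exactly these bytes instead of the CPU:

```c
/* Hedged sketch of driving an SN76489. PSG_PORT and PSG_CLOCK are
 * placeholders, not the Apricot's real values. */
#include <stdint.h>

#define PSG_PORT  0xC0u          /* placeholder I/O port     */
#define PSG_CLOCK 4000000u       /* assumed PSG clock, in Hz */

extern void outb(uint16_t port, uint8_t value);   /* platform-specific port write */

/* channel 0..2, frequency in Hz, atten 0 (loudest) .. 15 (silent) */
static void psg_tone(uint8_t channel, uint32_t freq_hz, uint8_t atten)
{
    uint16_t n = (uint16_t)(PSG_CLOCK / (32u * freq_hz)) & 0x3FF;  /* 10-bit divider */

    outb(PSG_PORT, 0x80 | (channel << 5) | (n & 0x0F));       /* latch + low 4 bits of divider */
    outb(PSG_PORT, (uint8_t)((n >> 4) & 0x3F));               /* data byte: high 6 bits        */
    outb(PSG_PORT, 0x90 | (channel << 5) | (atten & 0x0F));   /* channel volume latch          */
}
```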
Earlier this year I started work on an 8089 assembler, but I got distracted. I don't think one has survived for any 8086 platform, if any existed at all - Intel's ASM89 ran on their ISIS-II system on the 8080. I'll probably pick it back up in April (for Aprilcot! The celebration of Apricot computers in the month of April that's definitely not just me! Tell your friends!).
@bytex64 @nazokiyoubinbou time to summon @brouhaha with https://github.com/brouhaha/i89
@nina_kali_nina @nazokiyoubinbou @brouhaha Oh right, I did use that. 😅 But it’s mainly designed for standalone systems. I’m trying to make something that produces OMF objects, so you can link it into an executable and share symbols.
@brouhaha @bytex64 @nazokiyoubinbou and this is an incredible work as it is. Thank you!