If you need ten pages to explain your project and even after I read your description, I'm still left confused why I need it at all, then maybe... I don't need it?
It's due to the way the instruction is encoded. `lea` would've needed special treatment in syntax to remove the brackets.
In `op reg1, reg2`, the two registers are encoded as 3 bits each in the ModRM byte that follows the opcode. Obviously, we can't fit 3 registers in the ModRM byte because it's only 8 bits.
In `op reg1, [reg2 + reg3]`, reg1 is encoded in the ModRM byte. The 3 bits that were previously used for reg2 are instead `0b100`, which indicates a SIB byte follows the ModRM byte. The SIB (Scale-Index-Base) byte uses 3 bits each for reg2 and reg3 as the base and index registers.
In any other instruction, the SIB byte is used for addressing, so the syntax of `lea` is consistent with the way it is encoded.
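To make that concrete, here's a tiny C sketch (my own, not from the thread) that pulls the fields out of one valid encoding of `lea eax, [ebx + esi]`, which I believe assembles to the bytes 8D 04 33:

```c
#include <stdio.h>

int main(void) {
    /* lea eax, [ebx + esi]  ->  opcode 8D, then ModRM, then SIB        */
    /* ModRM = mod(2) | reg(3) | rm(3); rm == 0b100 means "SIB follows" */
    /* SIB   = scale(2) | index(3) | base(3)                            */
    unsigned char modrm = 0x04; /* mod=00, reg=000 (eax), rm=100 (SIB)       */
    unsigned char sib   = 0x33; /* scale=00, index=110 (esi), base=011 (ebx) */

    printf("mod=%u reg=%u rm=%u\n", modrm >> 6, (modrm >> 3) & 7, modrm & 7);
    printf("scale=%u index=%u base=%u\n", sib >> 6, (sib >> 3) & 7, sib & 7);
    return 0;
}
```

Three register fields in total, but only two of them fit in the ModRM byte; the third has to live in the SIB byte.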
When you encode an x86 instruction, your operands amount to either a register name, a memory operand, or an immediate (of several slightly different flavors). I'm no great connoisseur of instruction sets, but I believe this basic trichotomy is fairly universal across ISAs. The operands of an LEA instruction are the destination register and a memory operand [1]. LEA happens to be the unique instruction where the memory operand is not dereferenced in some fashion in the course of execution; it doesn't make a lot of sense to create an entirely new syntax that works only for a single instruction.
[1] On a hardware level, the ModR/M encoding of most x86 instructions allows you to specify a register operand and either a memory or a register operand. The LEA instruction only allows a register and a memory operand to be specified; if you try to use a register and register operand, it is instead decoded as an illegal instruction.
> LEA happens to be the unique instruction where the memory operand is not dereferenced
Not quite unique: the now-deprecated Intel MPX instructions had similar semantics, e.g. BNDCU or BNDMK. BNDLDX/BNDSTX are even weirder as they don't compute the address as specified but treat the index part of the memory operand separately.
The way I rationalize it is that you're getting the address of something. A raw address isn't what you want the address of, so you're doing something like &(*(rdi+rsi)).
LEA stands for Load Effective Address, so the syntax is as-if you're doing a memory access, but you are just getting the calculated address, not reading or writing to that address.
LEA would normally be used for things like calculating address of an array element, or doing pointer math.
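As a rough illustration (my own sketch; `addr_of` is a made-up name), the C below is exactly the `&(*(base + i))` idea. On x86-64, a compiler will typically lower it to a single `lea`, though that of course depends on the compiler and optimization level:

```c
#include <stdio.h>

/* Computes the address of an array element without ever reading or
 * writing memory - the same base + index*scale arithmetic LEA performs. */
int *addr_of(int *base, long i) {
    return &base[i];   /* &(*(base + i)): address math only, no dereference */
}

int main(void) {
    int a[8] = {0};
    printf("%p\n", (void *)addr_of(a, 3));   /* same as (void *)(a + 3) */
    return 0;
}
```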
> Why use the exponent sign to indicate the upper limit?
The caret is used to indicate the upper limit for the same reason some programming languages use it as the exponentiation operator (other languages use something else, like **; neither is how exponentiation is "normally" indicated outside of programming and its historic limitation to ASCII characters): its upward-pointing shape suggests that the following number should be thought of as raised above the normal baseline, which is (in somewhat different ways) true of both exponents and the upper limit of a summation. (This is the mirror image of why _ is used for the lower limit.)
That's because the positioning of n is similar to that of an exponent? As the author says, this is more about expressing the "visual rendering" using text. Hence the term "ASCII math", like in ASCII art.
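For example (my own illustration), the ASCII-math form `sum_(k=1)^n k` is just a linear spelling of the rendered formula, with `_` and `^` marking what sits below and above the sigma:

```latex
% _ marks the lower limit (below the summation sign), ^ the upper limit
% (above it), just as they mark subscripts and superscripts elsewhere.
\sum_{k=1}^{n} k = \frac{n(n+1)}{2}
```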
Very similar idea to Bruno: everything is configured in text, which I always find more productive than a full-blown GUI where I need to tab from text field to text field to get anything done.
The Amiga with its revolutionary coprocessors (Denise, Agnus, Paula) was a game changer.
However, id Software taught the world that the important part of the software revolution was not in these coprocessors but in fast 3D code. And the Amiga architecture was simply not ready to face that challenge.
Now to be fair, I don't think DOOM killed the Amiga, but I am reasonably confident Windows 95 did.
Isn't this the classic Wheel-of-Reincarnation cycle?
The Amiga's dedicated coprocessors let it do things an IBM AT or Mac 512k couldn't do easily.
But the AT evolved into the 386 and 486, and suddenly you no longer needed coprocessors to do the heavy lifting.
So you end up with two issues:
1. Did the Amiga market keep pace CPU-wise? I can see the 3000 and 4000 shipped with better CPUs, but how much was the software ecosystem defined by the 500/600/1200? I wonder if this created a tarpit for developers: the Amiga might have been able to do the same tricks as a faster PC, but the development effort was much higher than just recompiling brute-force x86 code to 68000.
2. Where were the custom coprocessors going? The original Amiga chipset was good, and I'm sure the ECS/AGA stuff was even better, but it seems like it was close to "at par" to a PC with a SoundBlaster and a basic Super-VGA chipset by the early 1990s. What could they use to create a new defensible territory? I could imagine a 3-D coprocessor, but designing something that both impressed today and didn't immediately obsolesce is hard-- see the corpses of a dozen early accelerator vendors.
The Amiga stagnated for years. The custom chips barely changed from 1985 until late 1992. ECS barely added anything over OCS (it had a couple new video modes nobody used.) As I mentioned in another thread, AGA was too little, too late. By the time AGA was out, 386 systems with SVGA and SoundBlasters were cheap and plentiful.
Yes, if Commodore wanted to compete, the Amiga chips needed to keep up with Moore's Law. Every 18 to 24 months, the processing and display power should double or more. In 1988, the Amiga 500 and 2000 should have had something like AGA.
The article correctly points out that 68k chips fell behind the x86 series. Yes, chunky pixels are useful, and the display hardware should support them. But if the game requires the CPU to draw each pixel one by one, the hardware has already failed. Instead, there should be some specialized processors that do the work. In fact, that's what happened in PCs. Doing all of the drawing with the CPU lasted only about five to seven years, then everyone had GPUs.
Yes! What killed the Amiga was that Commodore effectively stopped R&D on the chips in the mid-1980’s, and by the time they restarted they had already lost too much time.
(It’s possible to argue Steve Jobs was right when he dismissed the Amiga because it was too much hardware. He knew it would be difficult to keep evolving such an architecture. It’s also possible he was wrong because he didn’t account for Commodore’s chip design and manufacturing processes.)
In any case, by 1992, there were Macs capable of 24-bit color, and the 68040 was certainly capable of pushing enough pixels quickly to run Doom / Marathon / Duke 3D without hardware acceleration.
I remember someone saying "There was no market for 'Amiga games', there was a market for Amiga 500 games." If you look at it that way, it had a nice long run as a game machine, while failing in most other segments.
That makes total sense. The A500 was the most popular machine, with a 68000, 512K RAM. If your game required 1 meg, you'd lose a decent number of potential customers. (My A500, circa 1989, was souped up! 3 megs of RAM and a hard drive!)
Meanwhile, there was a market for high-end PC games. If you bought a better PC, you could show it off. That was important to the type of person who owned an Amiga (they always wanted to show you).
The early "3D coprocessors" weren't very 3D at all, they basically accelerated triangle rendering in screen pixel coordinates. So "2.5D" at most. I could definitely see some version of the Amiga shipping with something like that, leading to something not too unlike Sony-PS1 level graphics or so. But the real problem for the Amiga (and its nearest competitor, the Atari ST/TT) was that the 68k architecture was ultimately abandoned by Motorola, and at the time (with Moore's law in full swing, and thermal constraints not too important just yet) the PowerPC looked like the best alternative. Of course ARM was a thing already, and it even got used in a high-level game system (the 3DO). So you could surmise that we could've gotten an ARM based Amiga/Archimedes mix instead which would've kept some kind of "cheap home computer" market going for some time, trying to disrupt the costly PC and Apple Mac platforms at the low end.
I would say that hardware which helps rasterize triangles in screen coordinates is squarely 3D if it does Z-buffering (or any other hidden pixel removal).
The 68k architecture still had some runway by the time Commodore’s fate was sealed, though. Commodore really needed to be taping out the next generation chipset no later than 1990, and arguably 1988 would have been better.
I wonder if Motorola lost faith in the 68k because of a diminishing number of signature customers.
When Apple went for the PowerPC instead of continuing on to the 68060, the remaining audiences for high-end 68k were not going to move anywhere near the same numbers.
If Commodore and Atari had remained competitive longer, there might have been more demand (and conversely, enough R&D effort to tide the 68k over until we got to modern "everything is RISC after the decode stages" design paradigms).
Stupid question: why didn't they replace the 68000 with the newer Intel chips? Was it "just" (huge understatement, I know) because they'd need to port or break software due to the new arch? I guess my question is whether their multi-coprocessor architecture could've worked with a better, stronger main processor like Intel's?
Commodore did have a line of IBM PC clones if that's what you're suggesting. Adopting x86 without the other elements of the PC architecture would've been pointless. The 386 would've made programming just about bearable for devs used to the m68k's flat address space and elegant from-scratch ISA design. It would not have been very successful.
When Commodore died, Motorola's 680x0 series of CPUs were still competitive with Intel's x86, but Commodore would use the previous generation (or two) model to keep the price low. If you wanted the latest, fastest CPU, you had to buy an accelerator (eg. GVP). They should have had the top of the line processors available for those who needed or wanted them and could afford them.
A few years later, Motorola was failing to keep up, which is why Apple switched the Mac to PowerPC.
> Commodore would use the previous generation (or two)
The reason for picking the 68000 at that price was the bargaining power you had when you owned a chip fab.
"Live with Dave Haynie - Commodore Business Machines C128, Plus4, Amiga" - BilHerd https://www.youtube.com/watch?v=6ZT209i-3Lo Dave reveals Commodore was paying $2.5 per 68000 Hitachi CPU compared to Apple $8 from Motorola.
I really think that the Amiga 3000 killed the Amiga. When it was released, instead of moving another decade ahead graphics-wise, it was 3-4 years behind. 640x512x16colors?!? The 3000 needed to be released with AGA, on a card.
Comparisons. In October 1990.
For $3999, you had the Amiga 3000 with 640x512 in 16 colors. It did have an '030 at 25MHz, a 40MB hard drive and 2MB of RAM. The monitor was extra.
For $3860, you could get a 386/33 with 4MB of RAM, 512K SVGA (800x600, 8-bit), a 160MB hard drive AND you got the 14" monitor.
The Amiga was dead a few years before Win95. The VGA chipset and the soundblaster killed the Amiga.
In the late 1980s I wanted an Amiga so badly. But by the early 90s I had a 486 with VGA and a sb16 and it was all over. The Amiga had a mere fraction of the PC's power by then.
Indeed. Commodore killed itself and the Amiga, and the breakneck speed of PC advancement didn't help. Doom was just a side effect of it all. It left a trauma on the Amiga community though; you can still see the community mentioning Doom to this day (see, it can run Doom?). Doom-envy is omnipresent. There was another, rarely mentioned: Nintendo-envy. The NES and then the SNES in particular had killer games where the Amiga never came through in the same capacity (platformers; it shined in other genres). The Amiga was a poor man's SGI at the time. It was great, fun, relatively cheap for what it offered. It could've been so much more if Commodore had had a sense of direction and focus. Alas, here we are, lamenting its fate decades after.
The cachet it left is still strong. A few years back I tried to get hold of an Amiga again. I just wanted one endgame A1200.. now I have three A1200s - one with a Blizzard 1230/030+FPU which is the best general-purpose setup IMO, one with a Blizzard 1260/060rev6 for demos (not that great compatibility for general purpose), and one with a TF1260/060rev6.. and then two A600s (one stock, one with a Furia030), an A500+, Indivision add-ins etc., and a whole bunch of Commodore 1084S monitors. It was supposed to be only one A1200, damnit. Take it as a warning from a friend if you want to get one: they multiply fast.
Nah. The Amiga 1200 debuted in 1992 for $600. 2MB RAM and a 14Mhz 68020. No monitor.
In 1992 you could get a 486dx 33MHz with 4MB for like $800 (a two year old chip) with similar peripherals. Way more than double the power for a marginal increase in cost.
The Pentium arrived a year later in 1993 and by 1994 we had the 486dx4/100MHz and Pentium/586 at similar clock speeds. This is around when doom arrived and Amiga was long since toast.
The combination of cheaper, faster processors and a market that wanted to constantly upgrade is what killed the Amiga.
As mentioned by an earlier commenter, who your market was really counted back then. Businesses have upgrade cycles every 3 years as they write off depreciating assets. That subsidises the next generation of improvements.
Even if Commodore had had access to a single chip generation capable of fast 3D code, it wouldn't have saved them.
For example, the Acorn Archimedes had access to much better processors[1] and a chunky 256 colour mode at launch in '87.
That still didn't save Acorn, because they lacked those business upgrade-cycle sales.
I don't think an '87 Archimedes had enough oomph to run Doom back then, but it could have run something Doom-like that the Amiga A500 couldn't, if really pushed.
It still wouldn't have interested the business users that really made the PC successful and by '93 Intel would still have been ahead.
1. The 12MHz 32-bit ARM2 was at least 7x faster than an 8MHz 68000 in 1987, and twice as fast as a 16MHz 386, using Dhrystone.
I couldn't resist :) This video is Doom 2 on a 12MHz ARM, but running on a VGA monitor (CRT is faster), so I'm guessing the frame rate would be about this or a little faster on an 8MHz ARM.
https://youtu.be/jXo6BtmuZkc
Living in central Europe, I didn't even know at the time that a company named Acorn existed, despite reading tons of computer literature.
Acorn had no retail/dealer network, no marketing, and zero presence outside the Commonwealth (the UK and I think Australia), or even more narrowly outside the UK/AU educational market, which bought Acorns due to the inertia of the BBC Micro Computer Literacy Project. Acorn was run so badly they didn't even have money to pay the BBC for this absolutely fantastic and cheap marketing! Part of the reason for the sale to Olivetti was to gain retail channels and pay back BBC royalties. In the end the BBC was forced to write off the last couple of payments.
Windows 95 also came close to killing the Mac. Before then, the difference between Windows and MacOS was so striking that it was obvious that anyone who really wanted a GUI would go for a Mac. But then the advantage became much less strong. If Jobs hadn't come back and brought NeXTSTEP, which became OS X, I think the Mac would have gone the way of the Amiga.
The reason why Apple didn't go bankrupt is not that Jobs brought NeXTSTEP, but that Bill Gates gave them money when they were months away from having to shut down.
That was certainly important in the short term (obviously Gates wasn't doing that to be nice, but because having Apple die would look bad in the then-ongoing antitrust investigations into Microsoft; it's the same reason Google sends money to Firefox today -- having a competitor is a great defense against accusations of monopoly).
But Jobs didn't just keep a marginal company running; he turned it into a company comparable in worth to (and often worth more than) Microsoft.
TLDR: Microsoft was stealing from Apple with the help of Intel, both companies scared by QuickTime positioning Apple as the leader in multimedia (in 1991 Adobe Premiere was built by an ex-QuickTime engineer on the Mac platform; in 1991 Avid was ported from Apollo workstations to the Mac). When Jobs came back in 1996 he didn't like (or couldn't afford) all the litigation and promptly settled for money and a Microsoft support commitment (Office, IE) in exchange for letting MS save face.
"David Boies, attorney for the DoJ, noted that John Warden, for Microsoft, had omitted to quote part of a handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million."
"Microsoft and Intel had been shocked to find that Apple's QuickTime product made digital video on Windows seem like continuous motion, and was far in advance of anything that either of them had, even in a planning stage. The speed was achieved by bypassing Windows' Graphics Display Interface and enabling the application to write directly to the video card. The result was a significant improvement over the choppy, 'slide-show' quality of Microsoft's own efforts. Apple's intention was to establish the driver as a standard for multimedia video imaging, so that Mac developers could sell their applications on the Windows and Mac platforms. Microsoft requested a free licence from Apple for QuickTime for Windows in June 1993, and was refused. In July 1993, the San Francisco Canyon Company entered into an agreement with Intel to deliver a program (codenamed Mario) that would enable Intel to accelerate Video for Windows' processing of video images."
"Intel gave this code to Microsoft as part of a joint development program called Display Control Interface."
"Canyon admitted that it had copied to Intel code developed for and assigned to Apple. In September 1994, Apple's software was distributed by Microsoft in its developer kits, and in Microsoft's Video for Windows version 1.1d."
It is also notable that the Mac was stagnating just like the Amiga around that time. The successor to OS 9 was delayed for years and years before being cancelled. The hardware was getting more expensive but with only small incremental improvements in speed or capacity. The Centris and early Performa lines were just so mediocre. The 68k architecture was stalling out just as Intel was blowing everyone's doors off with x86. Jobs made a bad bet with PPC, but it was still way better than 68k and gave them enough breathing room to keep up with PCs for a bit.
No, the 68k lineup was competitive or better with x86 for the entire duration of Amiga’s viable lifetime. Apple’s troubles came later, after they had already successfully transitioned to PowerPC.
Yeah the Power Macintosh line, as the name suggests, introduced PowerPC in 1994. There were Performa versions of those as well. Jobs’ return as CEO roughly coincided with the beginning of the G3 era (calling them that instead of PowerPC 740 is already pretty Jobs-ian).
If anything, id proved to the world that Intel killed and buried the 68000. But Commodore was in its death throes anyway, so I don't think there's much of a lesson to learn here.
This looks like a much worse Wolfenstein clone... No 3D as far as I can see (all rooms are the same height, the map is just a 2D maze), no textures on floors or ceilings. Not to mention a much lower resolution than Wolf3D - the graphics are barely legible.
Wolf3D is just (a 3D view onto) a simple grid shape. So this is doing a bunch of stuff Wolf3D can't do. The linked video talks about (but does not demonstrate) full variable heights which would be even closer to Doom, but this is already able to provide a sense of space and visual variety absent from Wolf3D.
How revolutionary can something be if its impact is relatively limited, it was released for a style of game that was being supplanted by 3d, and all of its major titles were rereleased within years for commodity x86 hardware (due to the rapid advancements in cpu technology)?
The Amiga, as a general purpose computer, was a bright vision in a world that was already leaving it behind.
Most Amiga users - at least the home computing/gaming enthusiasts - never saw a big-box Amiga back in the day. To most users, 'The Amiga' was the A500, which released in '87 and reached peak popularity in the very late 80s/early 90s.
And to most Amiga gamers, Doom was where it became clear the PC had overtaken. Amiga game developers also got obsessed with trying to build a Doom clone for the Amiga, repeatedly showing how futile that effort was*, and much talent was wasted in that pursuit rather than making more good 2D games.
But by the time Doom arrived, the SNES had already been out for a couple of years, too, and despite a weak CPU it absolutely destroyed the Amiga in terms of 2D graphics performance (multi-layer parallax, loads of sprites, loads of colours, and 'Mode 7' effects)
(*ok, maybe not entirely futile if you've seen Dread/Grind which are super-impressive, but it took until the 2020s to pull it off, with an engine that seems about halfway between Wolf3D and Doom in terms of capabilities)
Eh. When I saw the Macintosh II demoed in 1987 compared to my Amiga 2000 which was released at basically the same time, I already knew the Amiga was screwed.
Looking at the UIs for early Macintosh vs. Amiga Workbench it feels like Amiga didn't put much thought into aesthetics. Just look at the color scheme they chose for the desktop UI:
It seems they had at least 64 colors to choose from and they went with... Orange and white on blue? They also decided icons should be able to be arbitrary sizes for some reason, a feature I haven't seen in other GUIs.
Here's workbench 2.0, which is better but still pretty unappealing.
That color scheme was allegedly picked for visibility on very low-end TV sets. (The Amiga palette colors could be chosen arbitrarily from 4-bits per channel RGB, so that wasn't a constraint.)
(The icons weren't just arbitrary sized, they could have different pictures for the selected and unselected state. You see this with the open vs. closed drawer icons (drawers are like "folders" in other OS's) but many app icons also used this effect.)
Be that as it may, if you look at C64 GEOS which ran on consumer TV sets and with comparatively primitive hardware, the UI looks so much nicer in GEOS to my eye. Even the Apple II GEOS makes Workbench look like a hot mess.
There weren't any 'default' icons, so the workbench didn't show you everything on the disk, only programs with ".info" files which had the icon. You had to drop to the command line to do most low level things. Of course, Workbench didn't really matter when you were playing games.
There was an option to show everything - with default icons. (Not very useful ones since notions of document ("project") file types were not used all that much, so every file showed as some sort of executable. But the option was absolutely there even though it wasn't the default.)
At the time, though, it didn't seem any worse than the Mac, and it had colors - even if the defaults weren't great, any colors seemed better than none.
Or at least that was how tasteless teenage me saw it.
The Mac barely survived Wintel, so it's not that apt a comparison. If Microsoft hadn't ported Word/Excel over, I don't think we'd have many options now.
The Amiga got completely creamed by x86 and was a footnote by 1994, when it was cancelled. Its best titles were ported to other platforms or forgotten. The people who succeeded the strongest on it moved on to other platforms, but its architecture really didn't live on in any mainstream descendants.