Does anyone know about the Dragon 32 or 64? I typed in a word processor for it in machine code, but I got the save addresses slightly wrong. Can someone tell me what the real save addresses are? Thanks. Dragon User, 1985 I think it was.
Hmm, on the instruction-set side, AMD created the current one (AMD64), which is what Intel also uses on its 64-bit CPUs. The AMD64 instruction set is a hybrid between RISC and CISC.
Plenty tried. MIPS (the original consumer RISC architecture), POWER, Alpha, i860, SPARC, PA-RISC, etc. could all have done it potentially, but nobody really filled the same niche as ARM. There was no shortage of competitors.
This CPU is "the" daddy of all CPUs. Could you tell me what/where it is now, please? RISC instructions were the way forward, leaving Intel and the like pondering the future! :) The CPU market has always fascinated me, right from the ZX80 (Zilog), and I was wondering if the CPU fraternity was aware of the RISC CPU's inception!! LOL :)
Everyone's now moved to some form of ARM, even for desktops. Basically their predictions were all proven correct: that as machines became smaller and/or portable, low-power processors would rule. Also, because they produce less heat and can theoretically scale up better, they were always going to displace Intel sooner or later. Intel did the opposite: power hungry, immense heat, and, as it's now turning out, poor scaling.
I love Acorn, my first computer was an Electron and I've had many others, but compared with the Amiga 1200 that launched around the same time as the A4000, the difference is night and day and the Amiga was £600 cheaper.
It is unfair to compare the price of the A4000 (separate casing and keyboard) with the all-in-one A1200. You should compare the A1200 either with the A3010 or the A3020. Then the price difference isn't GBP600 but drops to below GBP250... And the Acorn machines, with their ARM250 running at 12 MHz, are still, again, lightning fast compared with the Amiga. It was obvious for productivity applications: you should definitely try to see ArtWorks running on an Archie, or apps like QRT rendering a scene, about 8 times faster than on an A1200. Remember too that the Archies had HD diskette drives (capacity up to 1600 kbytes), and still 8-channel stereo sound, when the A1200 was still stuck with double-density diskettes and still 4 channels with only 3 stereo positions (7 for the Archies, independent, and as demonstrated on my channel all Archies can play tunes at a much higher frequency than Paula can, i.e. over 61 kHz for 4-channel MODs, and over 41 kHz for 8 channels).
Zarchos, you're preaching to the choir to a point; as I said, I'm an Acorn fan, and I also own an A4000 and my original Risc PC, which is my baby. However, by this period everyone I knew creativity-wise had Amigas or was slowly migrating over to Mac (as I did), apart from musicians, who with very few exceptions all had Ataris. The Amiga got what were at the time becoming industry-standard tools for many applications, and expandability was also pretty exceptional, which made a massive difference; even the unexpanded A1200 was a realistically useable option for high-res image and video work, although the Amiga 4000 would be the preferred machine. Wander into a design house back around '93 and if there were computers you'd either see Amigas, Mac Performas or in many cases a combination of the two. A huge number of hardware vendors, especially in Europe, were already heavily invested in Amiga, and in no time at all there were extraordinarily fast accelerator cards with, for the time, gigantic amounts of RAM. A PCMCIA slot on the Amiga 1200 was also an incredibly good idea! Acorn seemingly had sparse third-party support by this point and little marketing to speak of; I don't think at the time I was even aware they were still in the business until there was a buzz about Risc PCs a few years later. I actually far preferred RISC OS to Workbench from an aesthetic and functional point of view. The fact is that Acorn was so far off everyone's radar, and the perception was that the Amiga was doing things better, and for the most part it did. The fact the A4000 may have performed better in some instances is academic, as it was performing better in software none of us were using and for applications nobody appeared to be using it for.
I do not say the opposite: I stressed 2 points where you were wrong: 1/ the similarity between the machines: that's a 'no', and 2/ the price difference, which you say was big: again it is a 'no'. These are facts. Commodore's marketing was much better than Acorn's, YES, with a lot of hype and lying to the Amiga users about what was going to come next... Commodore machines were sold worldwide, not Acorn's. That doesn't mean the Archies and later on the Risc PCs were not used by professionals, and didn't have great apps even the PCs and the Macs didn't have back in the day. It is a little bit the 'happy few' knowing why they used Acorn machines and their apps, and believe me: yes, they were really happy. Remember: with low margins and a lot of money spent on advertising instead of R&D, Commodore entered a price war against Atari, the Macs and the PCs. Acorn did not. Eventually Commodore and Atari went bankrupt, while Acorn split its businesses, sold them, and then closed (no bankruptcy). Hermann Hauser is worth over a billion pounds sterling today. What about the people at Commodore or Atari? Selling carpets and T-shirts in Soho?
Zarchos, external-keyboard-wise I'd have to concede, as there was no option for that on the A1200 until a few years later, not that it made a blind bit of difference on the desk. Monitors were an option at purchase and none of them were £600.
It is not only the external keyboard: hardware expansion inside the casing is possible too (plus a CD-ROM slot), so yes it does make a difference, and it is of course in the price. The difference between a toy computer (Amiga) and a real computer (Archimedes). It is like when I read price comparisons between the A300 or A400 series and the Amiga 500: those machines should be compared with the Amiga 1000 or Amiga 2000. Honesty isn't widespread among Amiga users.
It was an accident, in the sense that as-low-as-possible power consumption wasn't an up-front design goal. It just turned out that the first chip was really, really frugal.
oiSnowy, for sure it's a fascinating story: their frugal RISC design creates a low-power wonder! And now, holy Jesus, the entire planet depends on ARM CPUs; there are more of them out there than any other CPU!!!! 🙄🙄🙄🙄🙄🙄🙄
Well, power consumption is a major design constraint, but then ARM were very fortunate in having sight of the Berkeley white paper. They also gave SPICE to Acorn.
Yes, complex or complicated. Compare them and you'll quickly understand that x86 is hell and ARM is simply paradise. Now, it is true that it should be 'chip' rather than 'computer'.
Why not "Convoluted Instruction Set Computer", if we keep defending a typo and turning it into a stubborn mistake? I can choose to call it "Contemporary Instruction Set Computer".
If you look at minute 3:37, then my comment, and then your comment, you will understand. PS: The only correct version for CISC is "complex instruction set computer".
The A4000 is the first step in what makes modern computers boring and uninteresting. The ARM is great, but the ARM250 is a soulless chip. SoC is where computing became a "black box" for the consumer. For me, it's the actual line in the sand where computing became uninteresting.
I had one or three of them; I ran computer clubs back in the day and we compared it to the Amiga, ST etc. Pity about it. The British have lost the ability to own their IP; we are far too willing to sell up and hand it over to investors. Of course, it's not owned by the British any more, and it may never be again. There is hope in the open-source RISC, but I fear that it will either die due to lack of investment, or be bought by the mega corps... and crippled.
More like the P3 --> Core Duo lines. The P4 was an abomination designed purely to get maximum MHz counts; it had an insanely deep pipeline. Intel obviously realized the error of their ways, and if you look at the Pentium M onwards, their core is basically the Pentium 3 with a bunch of stuff added. AMD were the ones to really pioneer RISC as the core of an x86 CPU. I haven't looked at Ryzen all that much, but I do know that Bulldozer back to the AMD64 were all basically RISC units at the core, with an x86 interpreter in the pipeline breaking instructions down into something their internal core understood. Thing is, the legacy x86 instruction set takes up a very tiny portion of the transistor count these days; it's pretty much negligible. Clock for clock the PC processors kill ARM with much higher IPC. I don't believe ARM even has branch prediction yet. My current i7 in my laptop at 3.5 GHz is giving me about 35 GFLOPS. When I looked up the Nexus 6P, which is my current phone, its octa-core ARM processor is only about 5-7 GFLOPS roughly. It also runs at about half the speed of the i7, but if you extrapolate, the i7 is easily 2-3x faster clock for clock. What's amazing is how far ARM has caught up in the past decade. My 2009 G1 overclocked to 650 MHz wasn't really all that fast at all.
Sophie Wilson and Steve Furber should be given knighthoods at least for their contribution to the world in starting it all. I'm just reading the history of ARM now, and I guess they were just employees after all and the rights belonged to the company at the time, and the rest is history, but yeah... wow... what a contribution.
ARM is amazingly good given its extremely low power consumption level. WTF does the Atari ST/Amiga have to do with anything? They predate the ARM architecture so... no shit!
@@rogerwinter1563 ARM wasn't "released" in 1985, it was in *prototype* stages FFS! ARM2 was the first commercial ARM device. What exactly do you mean by "aint found everywhere"? ARM exists in an overwhelming number of devices in today's world. The ratio of ARM devices vs. devices running other architectures is staggering, to the point it might as well be everywhere. What exactly are you trying to convey with your first statement anyway?
@@Banzeken you really are belabouring the point. Arm is attached to your hand which if moved up and down rapidly whilst holding another part of your body perfectly describes you
@@rogerwinter1563 Says you, the fucking jerkwad, taking a statement way the fuck out of context *AND* getting basic facts wrong on top of all that. No one said ARM existed literally *everywhere*, you absolute bumbling asswipe! NO FUCKING SHIT! Thanks for spelling it out, dickcheese! Pardon me for assuming I was going to get an exchange with even a smidgen of intellect here. Clearly you're a waste of everyone's time.
If you look at early-1990s games on the Archimedes overall, you will see why it failed: arcade and action games were all really badly converted from the less powerful Amiga 500/1000 hardware, and those were the best games available, unlike the PD-graphics-looking original/exclusive Archimedes titles. It's a great machine, but Lotus Turbo II, Chaos Engine, Zool II and Pacmania are either stock ports or more like the dumbass MS-DOS versions (like the god-awful in-game music of Chaos Engine on the Archimedes, which is nothing like the Amiga 500 version even though the machine is technically 200% better at sound than the Amiga 500). A great example of a machine killed by a lack of high-quality games software at the time of the A3000-A3010 home-targeted versions of this great machine.
You should consider 2 things at least: the sound isn't emulated properly, so never believe the sound of the Archimedes you hear on YT is the real sound if it is under emulation. And Acorn only sold their machines properly in the UK. If, for example, they had started selling the Archies in Germany from the A3000 (with German OS, keyboard, etc.) in 1989, and not with the ARM250-based machines and Risc PCs from 1992 onwards, things could have been very different.

Now consider this too: Acorn didn't go bankrupt like Commodore. By not entering a price war, Acorn successfully produced the ARM2-based machines, ARM3s, the ARM250, then Risc PCs with the ARM610, ARM710 and StrongARM, and they even had the Phoebe ready. All in all, hardware-wise, Acorn did much better than Commodore or Atari, and its founders earned tens of millions (GBP) with the ARM spin-off and other activities too. Can you say the same of the Commodore team or the Atari team? Certainly not. Listen to the latest interviews with Hermann Hauser and you'll see how clever these people have been. And visionary too...

PS: Copper-like colours and much more were perfectly doable. It's been done by Steve Harrison with his RasterMan module (see the video on my YT channel). This alone would have changed a lot for games and demos on the Archies. The user base was too small, and developers were too few, to really use what the Archies have in their guts. See again on my YT channel what a really optimised sprite-plotting routine can do: much more, and with far fewer constraints, than on the Amiga, and in 256-colour screen modes. Had people like Psygnosis or Team17 decided to investigate the Archimedes chipset, no doubt much better 2D games would have been produced.
Psygnosis produced more terrible games than good ones; the development team behind Shadow of the Beast 1 were the real talent and, to be fair, nothing to do with the staff of Psygnosis. The graphics are an age-old problem: unless a UK team was doing a VGA DOS PC version at the same time as the Amiga one, the graphics would remain the same as the Amiga's (which is annoying, as the A500 onwards were cheap, nasty, non-improved Amiga 1000 chipsets in much uglier cases).

There is nothing wrong with the hardware. If you take something like Sword of Sodan and check out the 65816-based Apple IIGS version against the Amiga version (which is coded pretty much using the audio-visual libraries contained in the onboard ROM, so hardly pushing the actual hardware limits of the Amiga), you can just imagine what was possible. Games like After Burner or OutRun were more than technically possible. My point was that the talent to write the RISC code AND a pixel artist of high enough calibre never found themselves working on the same game.

PS: I am from the UK and I have owned an A3010 for decades, and apart from the underwhelming games library and fewer heavy hitters in the artistic/creative software department (like Lightwave, or DeluxePaint's totally integrated 2.5D animation AND painting package), the machine would never be able to replace the Amiga's fighting chance against the overpriced, useless DOS/Win 3.1/95 rubbish everyone seemed to buy as the 'safe bet'. The software sold the Amiga at the end of Commodore's life, and at the start of the Commodore Amiga's life the perfect Marble Madness coin-op conversion in 1986 was a killer app that made every teenager playing on their ZX/CPC/64 want an Amiga... any Amiga... even the pig-ugly, cheap-and-nasty-build-quality, utter-crap Amiga 500 :) Zarch is an AMAZING game and Virus on ST/PC/A500 is rubbish, but after that things kinda didn't improve by the time the A3010/3020 faster-ARM home-computer-format cases went on the shelves. The fast screen memory and fast CPU of even the first Archimedes on sale were well known by ACE magazine to be fast enough to manipulate the screen memory 50 times a second without running out of CPU power, so like the 1987 Amiga 500, people did want it for the specs alone too.
I'll enhance this answer, but please: NO, ZARCH IS NOT AMAZING AT ALL! It was coded for the ARM1 (so there are no multiplications in the code), the filling routine is slow... many more optimisations would have been possible. Take a look here: stardot.org.uk/forums/viewtopic.php?f=29&t=10313&hilit=hacker+for+zarch with the links given to some YT videos. See StarFighter3000 and you'll understand how Zarch doesn't even scratch the surface of what a well-programmed 3D game can be on the Archies.
I originally incorrectly wrote that there was MLA etc. in the ARM1 - I really meant that the first ARM in the Archimedes machines had multiply instructions. So sorry that I missed the point being made - Zarch was coded on an ARM1 proto then? The ARM2 had MUL and MLA instructions but no division. (Edited and corrected - the ARM1 wasn't in the first Acorn Archimedes machines.)
Re: Mike Hunt - It might not have been so bad on the RT if Microsoft hadn't enforced its pathetic "Modern UI Windows Store"-ONLY requirement for third-party apps, which allowed no normal Windows apps, no services, and no drivers to be developed by third parties. It failed not because of the ARM chip, but because no one was allowed to port their Win32-based apps over to the platform. A rather silly mistake to make IMO, and they are repeating it yet again with Windows 10 S.
Umm, the Archimedes was actually nothing that amazing, and RISC CPUs have gone back in the CISC direction. I don't know why they even call it RISC anymore. Besides, the Amiga was the more revolutionary for the time and terribly underrated. It has influenced computing today more than the johnny-come-lately Archimedes. Acorn failed because it was too little, too late. In this video you make out that it was so ahead of its time, when in reality it was beaten to the market by years by a superior competitor whom you FAIL TO MENTION (because you're scared?): the Amiga OCS 1000 and 500.
At 4.6 MIPS with 32-bit instructions in 1987, the processor in this thing was certainly faster than most things around at that time, especially in its price range, high though it was. Some operations which would take other CPUs multiple instructions even came for free - i.e. moving a rotated or shifted value from one register to another in a single cycle. Another fabulous speed-up is something like "ADD r0, r1, r2, lsl #3" (r0 = r1 + r2 shifted left by 3 places), a single-cycle operation which would replace 3 machine instructions on a traditional processor of the time, including the 68000, which IIRC took a cycle or more for each shifted position as there was no barrel shifter. It was a thing designed for real computation, not just for playing games on. Acorn tested every processor available for their new machine, and couldn't find one good enough, so they designed their own. No mean feat for a small British company, and they should be given due credit for that at least.

The OS was super friendly by the time RISC OS 2 was released. But even from the start it had just about everything needed to write a game: BASIC and assembler, or even a mix of both. As said, the Pacmania code was written using nothing other than the inbuilt ARM assembler. Pacmania took some 2 or 3 seconds to assemble the program from its ARM source code contained in the BBC BASIC environment.
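To make the "free shift" point concrete, here's a tiny made-up illustration (not from any of our actual game code; registers chosen arbitrarily): multiplying by 10 with no multiply instruction at all, in two single-cycle operations.

    ADD r0, r1, r1, LSL #2    ; r0 = r1 + (r1 * 4) = r1 * 5, the shift costs nothing extra
    MOV r0, r0, LSL #1        ; r0 = r1 * 10, still just one cycle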
Miggy4Eva Was it a failure though? When you consider the processor born out of it is now the most used processor in the world, I'd say it did pretty well. I'm a huge Amiga fan and it has a lot of nostalgic legacy, but the BBC/Archimedes projects resulted in a far greater real world legacy in ARM.
RetroManCave, just because it had a great legacy, it doesn't mean it was particularly successful as a product. Look at Edison's phonograph: the first time anyone had ever heard their own voice. But within 20 years Edison's machines had all but gone, supplanted by shellac (later vinyl) records.
Shaun Hollingworth, compare the 4.6 MIPS Archimedes to the 386 at around 12-13 MIPS, which predated it by 2 years IIRC. Or just wait for the even faster 486 a few years later. And the later Archimedes had no chance of outperforming a Pentium.
Pacmania for Archimedes was regarded as one of the best conversions of that game to the home computer platforms. I'm not saying that because I converted it, though I did; I'm saying it because it really was better than anything else at the time. We also added special enhancements which took advantage of the superior speed of the CPU. The software-rendered sprites in Pacmania were bigger than those of the Amiga, because of limits on its hardware sprites, and the screen was pixel-scrolled in software, using the ARM's barrel shifter instructions, in far less than a television frame. The scrolling area was also much bigger than on the ST, which only scrolled half the screen, because that's all we could do with the far less powerful 68000, on a machine with no hardware scrolling or sprites.

Did you know that our earliest games (including Pacmania and Terramex) were written using the onboard ARM assembler contained in Acorn's BASIC interpreter? There was NO other machine on which something like that was remotely possible. Only later did we use a custom in-house cross assembler, specially written by myself, to make conversions easier.

We who worked at Krisalis Software back then were very proud of the Acorn conversions we did, and still are.

Shaun.
Amiga users don't understand that there was no need for a blitter, because the Archimedes can blit data thanks to its architecture: after the first register (32 bits) is transferred, all subsequent registers (32 bits) take only 1 cycle (125 ns on an 8 MHz machine) EACH!
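As a rough sketch in ARM assembler of the kind of copy loop this makes possible (the label and register choices are mine, purely illustrative, and it assumes the word count in r2 is a multiple of 8):

    ; copy r2 words from the address in r0 to the address in r1
    blitloop
    LDMIA r0!, {r3-r10}    ; load 8 words; the first pays the full access cost,
    STMIA r1!, {r3-r10}    ; the following ones stream at a cycle per word
    SUBS  r2, r2, #8       ; 8 fewer words to go
    BGT   blitloop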
Read the comments on my video here:
th-cam.com/video/hU23fEL154k/w-d-xo.html PACMANIA: Amiga vs Acorn Archimedes. Which version is best?
It is a pity Amiga fanboys remain arrogant simply because:
- they have no clue about the weaknesses of their beloved machines
- they demonstrate their utmost ignorance of the ARM chipset and the brilliant design by UK Acorn for the Archimedes.
Maximising use of the memory bandwidth was at the heart of the ARM design.
By contrast, the Amiga used outdated, off-the-shelf 68000 technology from 1979, with a poor design as far as memory bandwidth efficiency is concerned.
The idea of the blitter chip is brilliant, but it is there to compensate for the miserably slow 68000.
Even a 65C816 does memory transfers faster.
Add to this that a barrel shift by an immediate value costs ZERO cycles on the ARM (it costs extra cycles on the 68000, or the barrel shifter doesn't even exist until the 68020 is released; I'll have to check).
Furthermore, the way the LDR instruction works on the Archies, rotating the loaded word when the address isn't word-aligned, gives the machine a great advantage for building very efficient, fast plotting routines.
And with all instructions taking 4 bytes in memory, building 'engines' that generate sprite-plotting code in real time is much easier (and way faster) than on any other system.
This is why the game Scorpius is so amazing and people can't believe it runs at 25 frames per second on an 8 MHz machine: it keeps generating tailored ARM sprite-plotting routines, with an efficient memory manager to free the unused sprites and generate the code for whatever has to be displayed next.
th-cam.com/video/t4hfPKWM4Mk/w-d-xo.html
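To give an idea of what 'generating code' means in practice, here is a hypothetical fragment (my own sketch in rough ARM assembler, nothing to do with the real Scorpius source). Because every instruction is exactly one 32-bit word, emitting code is just storing precomputed words: this loop appends a run of "STR r2,[r1,#offset]" instructions to a buffer, one per destination word, then a return.

    ; r11 = buffer the generated routine is built in
    ; r5  = first destination offset (must fit in 12 bits), r6 = number of words to cover
    LDR   r4, =&E5812000     ; machine-code template for STR r2,[r1,#0]
    emitloop
    ORR   r3, r4, r5         ; fold the offset into the template
    STR   r3, [r11], #4      ; append the finished instruction
    ADD   r5, r5, #4         ; next destination word
    SUBS  r6, r6, #1
    BNE   emitloop
    LDR   r3, =&E1A0F00E     ; machine code for MOV pc, r14
    STR   r3, [r11], #4      ; finish the generated routine with a return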
This is exactly what I have been creating (without knowing they did that for Scorpius), because there is no other way to plot sprites fast on the Archies, but my routines are even more optimised, and they can work with RasterMan hardware-shifting the start of each horizontal scanline, or mirroring the screen, or whatever DMA video memory redirection and VIDC horizontal start offset is carried out per scanline.
Watch this: th-cam.com/video/K44V7ps_tfY/w-d-xo.html - the Amiga just can't compete!
This is a 320x256, 256-colour screen mode at 50 Hz, and any pixel of the sprite can be any colour from 0 to 255: a crippled machine like the Amiga will never ever be able to do it. Its coprocessors are weak and full of limitations; for a computer supposed to have the power of a console, it is a pity.
Look at the Japanese machines: they were clearly superior technical achievements.
Do not forget that, IIRC, Pacmania works on a 512 kbyte Archie: the A305 (fortunately discontinued after 6 months; 1 Mbyte of RAM became standard).
With 512 kbytes, it is not possible to keep the graphics pre-shifted in memory (at offsets 0, 1, 2 and 3 from a word boundary, i.e. an address that is a multiple of 4 bytes), which is why Pacmania does all the loading, shifting, combining and storing in real time, for each frame.
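For anyone wondering what 'pre-shifted' means here, a rough sketch in ARM assembler (my own illustration, assuming an 8 bits-per-pixel mode where a 1-pixel offset is an 8-bit shift; the registers and label are arbitrary):

    ; build a copy of one sprite row shifted right by 1 pixel
    ; r0 = source row (word aligned), r1 = destination, r2 = words in the row
    MOV   r4, #0               ; pixels carried over from the previous word
    shiftloop
    LDR   r6, [r0], #4         ; next source word
    ORR   r7, r4, r6, LSL #8   ; splice the carried pixels with this word, shifted
    STR   r7, [r1], #4
    MOV   r4, r6, LSR #24      ; keep the pixel pushed out, for the next word
    SUBS  r2, r2, #1
    BNE   shiftloop
    STR   r4, [r1], #4         ; flush the final carried pixel

The copies for the 2 and 3 byte offsets would use LSL #16 / LSR #16 and LSL #24 / LSR #8 instead, and of course all of those shifts are free.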
To me, for 2D games you have to use fast generated code, and these routines rely on pre-shifted sprites being present in memory; so you need a 2 Mbyte machine if your game is to have animated sprites, music and sound effects.
Here the architecture of the Amiga is more efficient as far as memory needs are concerned.
Pre-shifting really was totally unnecessary on ARM, so I don't see that the Amiga would be more efficient than the ARM was in this respect. There was still plenty of time left over at the end of the screen raster after everything was done. Don't forget, too, that all the sound samples were being read, scaled, and shifted into the sound buffer during this raster period as well!

When masking onto the screen, one had to deal with up to an extra 7 pixels (4 bits per pixel mode, in a 32-bit aligned access) even if one had pre-shifted data, but the shifting was implicit in the MOV, AND, and OR instructions, and one was always working on a 32-bit aligned boundary with the multiple load and store instructions. The shifts themselves took zero cycles, as they were always part of the machine instruction; even when no shift was required, one would just leave out the shift operand in the instruction source. In any case, a logical operation or MOV, shifted or otherwise, would execute in a single cycle. Pre-shifting graphics gave little advantage on the machine. We didn't store any form of graphic mask either; that too was generated on the fly, pixel by pixel (0000 being transparent), once the whole sprite width was shifted across a few registers. It was amazing just how fast this thing ran back in the day.

512k RAM - yes, that was with the OS still in place and running too, IIRC. It was the only machine at the time where we didn't completely trash the OS and run our own code to drive the hardware. Our later games always had a clean return to the desktop. Nowadays nothing less would be expected!
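For anyone who never wrote ARM2 code, here's a made-up fragment in rough ARM assembler showing the general idea (not the actual Pacmania routine; r1 = screen word address, r2 = one word of sprite data already aligned to that screen word): the mask is built nibble by nibble from the sprite data itself, and every shift rides along inside a MOV at no extra cost.

    MOV   r3, #0                ; transparency mask under construction
    MOV   r4, r2                ; working copy of the sprite word
    MOV   r5, #8                ; 8 pixels per word in a 4 bpp mode
    maskloop
    MOV   r3, r3, LSL #4        ; make room for the next mask nibble (free shift)
    TST   r4, #&F0000000        ; is the next pixel non-zero, i.e. opaque?
    ORRNE r3, r3, #&F           ; opaque pixel -> solid nibble in the mask
    MOV   r4, r4, LSL #4        ; step on to the following pixel (another free shift)
    SUBS  r5, r5, #1
    BNE   maskloop
    LDR   r6, [r1]              ; background word from the screen
    BIC   r6, r6, r3            ; punch a sprite-shaped hole in it
    ORR   r6, r6, r2            ; drop the sprite pixels into the hole
    STR   r6, [r1]              ; and write it back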
Yes, in the case of Pacmania pre-shifting isn't necessary, because you can do it 'on demand, in real time' and still run at 50 fps.
It is OK because you used a 4 bit-per-pixel screen mode, and not an 8 bit-per-pixel one like mode 13.
You end up with something very similar to the Amiga.
Also, the memory constraint (512 kbytes to have everything running) makes pre-shifting quite impossible.
For other games, to fully exploit the Archimedes' capabilities, it is necessary.
Have you seen this, Sir?
th-cam.com/video/t4hfPKWM4Mk/w-d-xo.html Scorpius game demo
It can't be achieved by shifting in real time: you must use pre-shifted graphics and generated code.
It was to have music too, and it runs at 25 fps; the screen mode is 320 x 220, 256 colours.
On a 1 Mbyte machine, using pre-shifting would have let Pacmania run in a 256-colour screen mode, overscan 384x288, at 50 fps.
Now, with RasterMan allowing greater use of the hardware features of the Archimedes, it would be even simpler to achieve: it makes the Archimedes look like an Amiga for many of its copper and blitter operations.
I wonder if business was very good for you, converting games to the Archies.
I believe so: it must have been very profitable, because you had no competitors and piracy wasn't significant on the Acorn machines.
A very smart move, from a business point of view ;-)
Bravo!
Kudos to you
Thanks for this. I was proud to be part of the team that brought the first Archimedes to market. An amazing time in British technology development and one that has left an incredible legacy.
Definitely. You should be very proud.
For years I kept in touch with Sue Wall, who looked after all the ISVs (Independent Software Vendors), but I've since lost touch with her. If you have any details I'd be grateful. I also remember David Lowdell well; he always took an interest in what we were up to.
I also recall RISC OS as a pioneer of anti-aliased scalable fonts, which were simply not a feature of home computers of the time.
That was Sophie Wilson's work too!
"we came away from there thinking; if they can build a processor, then we can"... Utterly glorious. 😊
I remember the Archimedes quite well. ARM chips etc. My school had some of them and, for a short time, my brother and I were obsessed with them and wanted to save up our pocket money to get one. Of course, that lasted about two weeks before we lost interest, as kids tend to do.
It was also the first full 32-bit operating system with a WIMP interface. Think how long it took Windows to become 32-bit!
Windows 3.1 came out in 1992 and is regarded as the first usable version of Windows; RISC OS came out in 1987 and was instantly usable and powerful.
I have to admit, even though I knew of the ARM250 I didn't realise it was the first system-on-chip! Impressive.
Sophie Wilson and Steve Furber deserve more recognition for their work on the ARM.
Great video, I'm sitting watching it with my Acorn A4000 set up on the desk next to me. I rescued it when I was at school, it was getting thrown out because it had been replaced with a classroom full of shiny new Siemens PCs. All these years later it's still working well, now with flash based storage.
Similar story here. I've got three Risc PCs under my desk, including a two slicer! I bought a 3010 when I was in University in 92. Damn, the things I did on that machine. Wonderful.
Very well put together documentary video. Concise and to the point. Lots of information in just 8 mins 12 seconds.
I remember seeing the Archimedes game Zarch on Tomorrow's World and being totally stunned at the graphics. It was really impressive back then.
I played Zarch a lot on my dad's Archimedes. I played Conqueror too, a tank game with similar graphics that I think shared much of its development with Zarch. Awesome, incredible game. So far ahead of its time.
My very first computer was the A3010 (the one with the green function keys).
You can try RiscOS now on the Raspberry Pi.
Yup, a free download from RiscOS Open Ltd. Even works now on the RPi 3.
I remember hearing several years ago that there was double the amount of ARMs on earth than there are arms.
Well, the world's population of primates is around 7.5 billion, so that's ca. 15 billion arms. By now there are 180 billion ARMs. That's 12 times as many ARMs as arms.
Now ARM returns to the desktop with Apple Silicon starting with the Apple M1.
The first computer I ever used was an Acorn Archimedes in 1990. Nostalgia.
My secondary school was full of these computers (we had the BBC Micros in primary school). They had a huge network of them, with every student getting a user account. You could log onto one in the library and access the documents you had been working on in the IT room only moments before; it blew my mind, as I had a C64 at home and had never experienced networking of any kind. Glad I got to spend a bit of time with the machines. As soon as we left school everyone was switching to PC/Windows and the Acorn died off pretty fast; I've never seen or used one since.
The ARM is still the best desktop CPU in my opinion:
My first thirty-two-bit system was an Archimedes A310. Today my primary computer is a Raspberry Pi B+ running RISC OS 5.24 (a system with many of the same people behind it as the original A310, in modern times). I only use a Linux system (Raspberry Pi 3B) for heavy web browsing (JS-intensive pages, or video, e.g. YouTube).
Have you got DOSBox running? There is more than enough DOS software you can still download on the web.
Everyday use: a 1994 (23-year-old) RiscPC.
Replaced the CMOS battery. I switch it off only to try or install another SCSI card.
Best computer investment ever!!!
What does a RiscPC run?
A RiscPC, you mean, paul durao? RISC OS, Microsoft Windows and Linux.
This computer really did deserve more recognition than it got...the gaming potential was huge.
I've never really given it a lot of thought but having worked 25 years in tech, sure, pretty damn important computer after all. Thanks lad.
Used to use this at school, Chocks away
Magic you've kept it all this time
My Dad bought an Archimedes (1994ish?) for some kind of music editing software, I believe. But what it ended up being, for me at the age of 5, was where I got a huge love of games. We already had an Atari in the house, but I found it too slow and unreliable. I still cherish Lemmings (even now, my favourite game ever!), and tons of other stuff like Pacmania and TwinWorld, and lots of kids' educational software for English, Maths, Science and IT. Even when I got my PlayStation a couple of years later, I still used the Acorn for many years. I'm 28 now, and still singing its praises to everyone who'll listen!
Your Dad certainly bought Sibelius, the highly acclaimed music composition software, with over 20 international awards for its excellence. Check with Google: www.avid.com/en/sibelius This is again one of those killer apps we had on the Archies that so few people know about.
If you are still good with your Archimedes, LemmingSplat, why not try splatting a few Nazis in Wolfenstein? I never tried to splat a Lemming, but the odd Gruppenführer can be very satisfying. I will go along with you on the educational software side though; I believe !Numerator was the best educational software I have seen. I would change your on-board battery, if you have the A7000, in fairly short order!
Our high school had them in the 90s... decent machines to be fair
I had an A4000 (and an A3000 before, and a Risc PC 600 after that). It's been great watching the other OSes catch up - Mac OS X adding the 'Dock' (icon bar...) and context-sensitive menus becoming more and more de rigueur. If only Acorn had won at 'marketing' as well.
Everyone stole from Xerox. Or should I say "copied"? ;)
Writing the execution code for the ARM - what an achievement!
I used to write for Arc World and Acorn User. I loved the BBC Micro and onwards. I'm looking for the flute-like start-up bong of my first A305. Has anyone got it? The first Arthur upgrade sadly lost the bong. It was magic!
I had the BBC and followed ARM all through my electronics career, but never bought one (until the RPi). I looked at ARM for home-based consumer tech and it wasn't competitive at the time (late 90s/early 00s, I think). We were looking at building digital recording for analogue TV signals, prior to DTV broadcast. Most stuff was running off ST processors with built-in MPEG decode hardware; the ARM could do it in software, but still needed a hardware chip to do encoding. It was only with mobile electronics that it really hit home with its low power consumption. That was really an accident by Acorn, because they were just looking to save money on the package and go with plastic.
The last company I worked for did security camera design, and the market leader for low power was GoPro (ARM-based). They wouldn't license, and we had to use a more standard ARM part and suffer the increased power dissipation in the camera; this reduces the life and increases complexity and size. Processing efficiency is now a massive part of the game. The more processing and less heat you can put into a camera, the more features it can have, like object recognition, left objects, people tracking etc.
I missed out on the Archimedes... I had just discovered our college's second-hand PDP-11 (the previous most important computer in the world). By the time I got back into micros, everyone had Amigas. Huge hole in my cultural upbringing... took me 30 years to discover the joys of ARM.
Had no idea this existed till today, interesting story behind that processor.
really great vid. Thanks. And now, Apple have announced switching their Macs to ARM in the next 2 years, haha
Is she talking about Chuck Peddle at MOS/CBM?
No, the Western Design Center. They were a second source for MOS chips. While MOS were tied up making chips for the C64, WDC were selling to Apple, Acorn, and Nintendo.
Ace video! Never heard of? I remember the Archimedes very well - especially Zarch and the Lander demo. One of my schoolmates said that if he had one, he'd have permanently bunked off school and stayed at home programming it instead. :D
I've missed your videos, glad to see you back m8
Nice video - great to hear the designers talk about it. Although the Amiga is - The most important computer... IN THE WORLD (IMO)
It's hard to comprehend how significantly it changed the world
We had the A4000 at school back in early 2000. I finished secondary school in 2005 and they were replaced shortly before I left. I remember putting my hand on the monitor and ZAPPING my mates with the other 😂.
With Apple silicon, ARM is back on the desktop, but yeah, OS X is a bit worse than RISC OS.
My first 'proper' machine was an A440 [I think]
It was replaced by my 2nd 'proper' machine - A5000 - AWESOME
PS. The 3-button mouse WITH the *MIDDLE* "menu" BUTTON - the best WIMP UI interactive device...
- MS Windows: each window needs its own real estate
- Apple macOS: [dreaded] top of screen
Never heard of? I used to write software for it. Good old RISC.
Indeed, they were so worried about heating that they went all out for low power use. Turns out that was great for us all!
No mention of the Amiga as a competitor, just a Commodore 64 ad?
Another UK computer genius: Geoffrey Dummer from Hull, UK. See his Wiki page. Who would have thought that a Yorkshireman would be one of the prime movers in computer technology? He should be better known.
You should focus on Seymour Cray. He is the father of RISC.
Great video. I had an Acorn A3000, then an A3010, then an Acorn Risc PC. Now I have an iPhone and iPad, all ARM-based.
In UK schools, especially in the North East, these were the main computers in the 1990s.
We even had the "Computers for Schools" scheme ran by Tesco. I have an old VHS video of a family member being presented with an Archimedes by a Tesco store manager.
Fact: the fastest PC's on the market today are generally used as toys. They are used to play games, not to compute complex models of the universe.
If you want to design a processor, just do it, how hard can it be?
Understanding Boolean logic (algebra).
One must ask, when it comes to the graphics support, what about the Amiga? Also, in terms of pricing, the Amiga (and indeed the Ataris) were much cheaper. I think the A3010 was always more expensive, despite having a broadly similar spec. The Ataris had off-the-shelf components in the chipset, the Amigas had highly custom chipsets (OCS/ECS/AGA), and Acorn went the SoC route.
That said, the Archimedes were still groundbreaking. I recall reading that Acorn were way ahead of their time with their font support; they had anti-aliasing figured out in RiscOS 3, and it always looked beautiful for typography. I remember writing things in Pendown+ and marvelling at how clear everything looked on screen. It took until Windows 98, I think it was, for Microsoft to catch up. That technology was one of the major reasons Pace gobbled up the remnants of Acorn, as they incorporated a lot of how RiscOS displayed things on screen into their set-top boxes. Probably still little bits of it kicking around in STBs today!
David Rickard For certain types of games (platformers, shmups) the Amiga was king. The Amiga custom chip set was specifically designed with these types of games in mind, and its coprocessors served to alleviate the burden on the CPU. Amiga ports to the Arch were usually serviceable, but still inferior to the original (no copper to produce raster bars, for instance). However, the ARM processor packed more grunt than the Amiga's 68000, so the Arch had an edge when it came to 3D polygonal games. The Amiga chip set simply wasn't designed with these types of games in mind.
Had RasterMan been created 30 years ago all games would have been much better on the Archimedes since RasterMan mimics 90% of the hardware facilities offered by the copper and the blitter chip + extra features these Amiga chips can't offer.
I'm still not sure how he got round the latencies on the timer interrupts here. I tried everything to keep the switching point(s) stationary. Perhaps if we'd just taken complete control of all the hardware, and suspended the OS completely as we did on every other platform it might have been possible, but it would certainly have been frowned upon I think!
Shaun Hollingworth This is what Steve did. He had to write his own keyboard handler too, and to modify his QTM sound routines, so that the IOC timer, when it reaches zero, triggers an interrupt which is serviced first. At least, that is what I understood. The author of the cMEMC demo, a German Acorn enthusiast, Marko Lukat (aka BlackIce) (see on my YT channel th-cam.com/video/LeTlgOfLi40/w-d-xo.html ) did that too, and he was first.
It is because I saw this demo that I contacted Steve to develop the idea further; he agreed to come back to programming for the Archies, and blessed us with some other great pieces of code like LCDGamesModes and VIDCblaster.
He was also involved in the software preservation project by Johnathan Abbott, as a 'VIDC master' consultant and to provide a version of his QTM tracker able to replace (and obey the commands from) your Krisalis sound routines.
Join us on the stardot forum if you want to know more.
No, the Amiga and the Archies did not have similar specs: the Archies could natively display hi-res screen modes with 256 colours, non-interlaced. Even the sound bears no comparison: 8 logarithmic channels for the Archies with 7 independent stereo positions, versus 4 linear channels and only 3 stereo positions for the Amiga, and with a much higher output frequency for the Archies.
Computational power of the Archie is 6 times what the Amiga could deliver.
Comparing the Amiga and the Archie is like comparing a ZX Spectrum (1st gen, with the beeper) and an Atari ST.
Does anyone know about the Dragon 32 or 64? I typed in a word processor for it in machine code but I got the save addresses a bit wrong. Can someone tell me what the real save addresses are? Thanks. Dragon User, 1985 I think it was.
I am watching this on a RISC device
Hmm, on instruction sets: AMD made the current one (amd64), which is what Intel uses too on its 64-bit CPUs.
The amd64 instruction set is a hybrid between RISC and CISC.
I remember them in school. Ahh
Oh my god you uploaded again! :D
Yay he's back
Hi. What's the difference between ARM RISC and PowerPC RISC chips? Both are RISC. God bless, Belated Merry Christmas. God bless, Proverbs 31
Did you ever play crystal rainforest?
great video
No mention of the Amiga? LOL
I'm sure a would-be ARM competitor would have showed up and done its job at some point.
Plenty tried. MIPS (the original consumer RISC architecture), POWER, Alpha, i860, SPARC, PA-RISC, &c., all could've potentially, but nobody really filled the same niche as ARM. There was no shortage of competitors.
@@talideon If ARM wasn't there, surely one would've taken precedence? What is the niche that specifically ARM filled?
Do you not co operate with Roleplay UK anymore?
This CPU is "the" DADDY of all CPU`s and could you tell me,What/where is it now, please?
RISC instructions were the way forward, leaving Intel and such pondering the FUTURE! :)
The CPU market has always fascinated me right from the ZX80 (Zilog), and I was wondering if the CPU fraternity was aware of the RISC CPU's inception!!
L0L:)
But can it play Crysis?
Thanks! Subscribed!
What happened why no posts
I use #FreeSatelliteTV Receivers and they all run Linux on a ARM processor to make the video work!
Is the name of this at all related to LGR tech tales?
Entirely. Clint inspired me :)
Everyone's now moved to some form of ARM, even for desktops. Basically their predictions were all proven correct: that as machines became smaller and/or portable, low-power processors would rule. Also, because they produce less heat and can theoretically scale up better, they were always going to displace Intel sooner or later, which did the opposite: power hungry, producing immense heat and, as it's now turning out, not scaling up well.
Names of the songs?
he's back :-)
Excellent video. 'Normal' people never have a clue when I explain this to them. Not worth the bother.
Acorn may have been "Cashed out" by Intel and Microsoft in schools, but Acorn had the last laugh.
Thanks in part to Apple
@@Claro1993 True.
I love Acorn, my first computer was an Electron and I've had many others, but compared with the Amiga 1200 that launched around the same time as the A4000, the difference is night and day and the Amiga was £600 cheaper.
It is unfair to compare the price of the A4000 (separate casing and keyboard) with the all-in-one A1200.
You should compare the A1200 either with the A3010 or the A3020. Then the price difference isn't GBP600 but drops to below GBP250... And the Acorn machines, with their ARM250 running at 12 MHz, are still, again, lightning fast compared with the Amiga. It was obvious for productivity applications. You should definitely try to see ArtWorks working on an Archie, or apps like QRT when rendering a scene, about 8 times faster than on an A1200.
Remember too the Archies had HD diskette drives (capacity up to 1600 KB), and still 8-channel stereo sound, when the A1200 was still stuck with double-density diskettes and still 4 channels with only 3 stereo positions (7 for the Archies, independent, and as demonstrated on my channel all Archies can play tunes at a much higher frequency than Paula can, i.e. over 61 kHz for 4-channel MODs, and over 41 kHz for 8 channels).
Zarchos, you're preaching to the choir to a point; as I said, I'm an Acorn fan, I also own an A4000 and my original RISC PC, which is my baby.
However, by this period in time everyone I knew creativity-wise had Amigas or were slowly migrating over to Mac (as I did), apart from musicians, who with very few exceptions all had Ataris.
Amiga got what were, at the time, becoming industry-standard tools for many applications, and expandability was also pretty exceptional, which made a massive difference; even the unexpanded A1200 was a realistically usable option for high-res image and video work, although the Amiga 4000 would be the preferred machine.
Wander into a design house back around '93 and, if there were computers, you'd either see Amigas, Mac Performas or in many cases a combination of the two.
A huge amount of hardware vendors, especially in Europe, were already heavily invested in Amiga and in no time at all there were extraordinarily fast accelerator cards with, for the time, gigantic amounts of RAM.
A PCMCIA slot on the Amiga 1200 was also an incredibly good idea!
Acorn seemingly had sparse third-party support by this point and little marketing to speak of; I don't think at the time I was even aware that they were still in the business until there was a buzz about RISC PCs a few years later.
I actually far preferred RISC OS to Workbench from an aesthetic and functional point of view.
The fact is that Acorn was so far off everyone's radar, and perception was that Amiga was doing things better, and for the most part it did.
The fact the A4000 may have performed better in some instances is academic, as it was performing better in software none of us were using, and for applications nobody appeared to be using it for.
I am not saying the opposite: I stressed 2 points where you were wrong:
1/ the similarities between the machines: that's a 'no', and
2/ the price difference, which you say was big: again it's a 'no'.
These are facts.
Commodore marketing was much better than Acorn's. YES.
With a lot of hype and lying to the Amiga users about what was going to come next ...
Commodore machines were sold worldwide, not Acorn's.
It doesn't mean the Archies, and later on the RISC PCs, were not used by professionals, or that they didn't have great apps that even the PCs and the Macs didn't have back in the day.
It was a little bit of a 'happy few' situation, those who knew why they used Acorn machines and their apps, and believe me: yes, they were really happy.
Remember: with low margins and a lot of money spent on advertising instead of R&D,
Commodore entered a price war vs Atari, the Macs and the PCs. Acorn did not.
Eventually Commodore and Atari went bankrupt, whereas Acorn split its businesses, sold them, and then closed down (no bankruptcy).
Hermann Hauser is worth over a billion pounds sterling today.
What about the people at Commodore or Atari? Selling carpets and T-shirts in Soho?
Zarchos, external-keyboard-wise I'd have to concede, as there was no option for that on the A1200 until a few years later; not that it made a blind bit of difference on the desk.
Monitors were an option at purchase and none of them were £600.
It is not only the external keyboard: there are possible hardware expansions inside the casing too (plus a CD-ROM slot), so yes it does make a difference, and it is of course reflected in the price. The difference between a toy computer (Amiga) and a real computer (Archimedes).
It is like when I read price comparisons between the A300 or A400 series vs the Amiga 500: those machines should be compared with the Amiga 1000 or Amiga 2000.
Honesty isn't widespread among Amiga users.
I don't get how the low power was a pure accident.
It was an accident, in the sense that as-low-as-possible power consumption wasn't an up-front design goal. It just turned out that the first chip was really, really frugal.
oiSnowy for sure it's a fascinating story, their frugal RISC design created a low-power wonder! And now, holy Jesus, the entire planet depends on ARM CPUs; there are more of them out there than any other CPU!!!! 🙄🙄🙄🙄🙄🙄🙄
Well, power consumption is a major design constraint, but then ARM were very fortunate in having sight of the Berkeley white paper. They also gave up Spice to Acorn.
Life is great, you learn something new nearly every day.
"Complicated Instruction Set Computer"? Really? 3:37
Yes, complex or complicated. Compare them and you'll quickly understand x86 is hell, and ARM is simply paradise.
Now it is true it is not 'computer' but 'chip' which should be used.
Why not "Convoluted Instruction Set Computer", if we keep defending a typo and turning it into a stubborn mistake? I can choose to call it "Contemporary Instruction Set Computer".
I don't get it.
If you look at minute 3:37, then my comment, and then your comment, you will understand.
PS: The only correct version for CISC is "complex instruction set computer".
How is it complex? It's just a list of instructions.
The A4000 is the first step in what makes modern computers boring and uninteresting. The ARM is great but the ARM250 is a soulless chip. SoC is where computing became a "black box" for the consumer. For me, it's the actual line in the sand where computing became uninteresting.
I had one or three of them; I ran computer clubs back in the day and we compared it to the Amiga, ST etc... pity about it. The British have lost the ability to own their IP. We are far too willing to sell up and hand it over to investors. Of course, it's not owned by the British any more, and it may never be again. There is hope in the open-source RISC, but I fear that it will either die due to lack of investment, or be bought by the mega corps... and crippled.
It's even been in Intel processors since the P4.
More like the P3 --> Core Duo lines. The P4 was an abomination designed purely to get maximum MHz counts. It had an insanely deep pipeline. Intel obviously realized the error of their ways, and if you look at the Pentium M onwards, their core is basically the Pentium 3 with a bunch of stuff added. AMD were the ones to really pioneer RISC as the core of an x86 CPU. I haven't looked at Ryzen all that much, but I do know that Bulldozer back to the AMD64 were all basically RISC units at the core, with an x86 interpreter in the pipeline breaking down instructions into something their internal core understood. Thing is, any more, the legacy x86 instruction set takes up a very tiny portion of the transistor count these days. It's pretty much negligible. Clock for clock the PC processors kill ARM with much higher IPC. I don't believe ARM even has branch prediction yet. Like my current i7 in my laptop at 3.5 GHz is giving me about 35 GFLOPS. When I looked up the Nexus 6P, which is my current phone, its octa-core ARM processor is only about 5-7 GFLOPS roughly. It also runs at about half the speed of the i7, but if you extrapolate, the i7 is easily 2-3x faster clock for clock. What's amazing is how far ARM has caught up in the past decade. My 2009 G1 overclocked to 650 MHz wasn't really all that fast at all.
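As a rough sanity check on that clock-for-clock claim, using the commenter's own figures: 35 GFLOPS / 3.5 GHz = 10 GFLOPS per GHz for the i7, while the phone, taken at "about half the speed" (call it roughly 1.8 GHz, an assumption), gives about 6 GFLOPS / 1.8 GHz ≈ 3.3 GFLOPS per GHz, i.e. around a 3x gap per clock, broadly in line with the 2-3x estimate.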
I have heard of it bye.
Sexy computer wasn't it.
Would be great if I still had the magazine. :)
Hi. Why do we have RISC on our mobile gadgets anyway? Why not Intel's low power CISC processors? God bless, Proverbs 31
I'll take a wild guess... power consumption, performance and price.
Because it would just explode and you'd be picking the bits of plastic out of your face for 7 years 7 months and 7 days probably.
ahh the beeb
Wow! who owns rights to the ARM chip now?! didn't they patent it?!! they should be billionaires for this invention alone..
They sold ARM to Softbank for $32 Billion last year..
Thanks, sorry haven't watched this vid since my comment, could you refresh me on who they are, thanks. 32 Billion! incredible.
ARM Ltd, the British company, but now owned by Japan's SoftBank, who I think sold 25% of it to Saudi Arabia.
ah I see never knew there was an actual company named after it, thanks again.
Sophie Wilson and Steve Furber should be given knighthoods at least for their contribution to the world in starting it all. I'm just reading the history of ARM now and I guess they were just employees after all and the rights belonged to the company at the time, and the rest is history, but yeah... wow... what a contribution.
Arm designed 1985.
And what is Acorn doing right now? Elevators and lifter chairs ;-D
And making a lot of money off it
So... The Acorn Archimedes is basically Jesus?
Eye caw seven. Hehehe
Not everywhere: it's not in my Atari ST, my Amiga, or any Intel or Mac computer, so it cannot be that good.
ARM is amazingly good given its extremely low power consumption level. WTF does the Atari ST/Amiga have to do with anything? They predate the ARM architecture so... no shit!
@@Banzeken Atari ST released 1985, hmmm, same date as ARM. Twat, all I was pointing out was that it ain't found everywhere.
@@rogerwinter1563 ARM wasn't "released" in 1985 it was in *prototype* stages FFS! ARM2 was the first commercial ARM device. What exactly do you mean with "aint found everywhere"? ARM exists in an overwhelming amount of devices in today's world. The ratio of ARM devices vs. devices running other architectures is staggering to the point it might as well be everywhere. What exactly are you trying to convey with your first statement anyway?
@@Banzeken you really are belabouring the point. Arm is attached to your hand which if moved up and down rapidly whilst holding another part of your body perfectly describes you
@@rogerwinter1563 Says you, the fucking jerkwad, taking a statement way the fuck out of context *AND* getting basic facts wrong on top of all that. No one said ARM existed literally *everywhere* you absolute bumbling asswipe! NO FUCKING SHIT! Thanks for spelling it out, dickcheese! Pardon me for assuming I was going to get an exchange with even a smidgen of intellect here. Clearly you're a waste of everyone's time.
If you look at early 1990s games overall on the Archimedes you will see why it failed: arcade and action games were all really badly converted from less powerful Amiga 500/1000 hardware, and these were the best games available, unlike the PD-graphics-looking original/exclusive Archimedes titles. It's a great machine, but Lotus Turbo II/Chaos Engine/Zool II/Pacmania are either stock ports or more like the dumbass MS-DOS versions (like the god-awful in-game music of Chaos Engine on Archimedes, which is nothing like the Amiga 500 version even though the machine is technically 200% better at sound than the Amiga 500). A great example of a machine killed by lack of high-quality games software in the time of the A3000-A3010 home-targeted versions of this great machine.
You should consider at least 2 things: the sound isn't emulated properly, so never believe the sound of the Archimedes you hear on YT is the real sound if it is under emulation.
Acorn only sold their machines properly in the UK.
If for example they had started selling the Archies in Germany from the A3000 (with German OS, keyboard, etc...) in 1989, and not with the ARM250-based machines and RISC PCs from 1992 onwards, things could have been very different.
Now consider this too: Acorn didn't go bankrupt like Commodore... by not entering a price war Acorn successfully produced the ARM2-based machines, ARM3s, the ARM250, then RISC PCs with ARM610, ARM710, StrongARM, and they even had the Phoebe ready.
All in all, hardware-wise, Acorn did much better than Commodore or Atari, and its founders earnt tens of millions (GBP) with the ARM spin off and other activities too.
Can you say the same with the Commodore team or Atari team ?
Certainly not.
Listen to the latest interviews of Hermann Hauser, you'll see how clever these people have been.
And visionary too ...
PS: Copper-like colours and much more were perfectly doable. It's been done by Steve Harrison thanks to his RasterMan module (see video on my YT channel).
This alone would have changed a lot for the games and demos on the Archies.
The user base was too small, and developers were too few to really use what the Archies have in their guts.
See again on my YT channel what a really optimised sprite-plotting routine can do: much more and with far fewer constraints than on the Amiga, and in 256-colour screen modes.
Had people like Psygnosis or Team17 decided to investigate the Archimedes chipset, no doubt much better 2D games would have been produced.
Psygnosis produced more terrible games; Shadow of the Beast 1's development team were the real talent, and it was nothing to do with the staff of Psygnosis to be fair. The graphics are an age-old problem: unless a UK team was doing a VGA DOS PC version at the same time as the Amiga one, the graphics would remain the same as the Amiga's (which is annoying, as the A500 onwards were cheap, nasty, non-improved Amiga 1000 chipsets in much uglier cases). There is nothing wrong with the hardware: if you take something like Sword of Sodan and check out the 65816-based Apple IIGS version vs the Amiga version (which is coded pretty much using all the audio-visual libraries contained in the on-board ROM, so hardly pushing the actual hardware limits of the Amiga) you can just imagine what was possible. Games like Afterburner or OutRun were more than technically possible. My point was that the talent to write the RISC code AND a pixel artist of high enough calibre never found themselves working on the same game.
PS I am from the UK and I have owned an A3010 for decades, and apart from the underwhelming games library and fewer heavy hitters in the artistic/creative software department, like Lightwave or DeluxePaint's totally integrated 2.5D animation AND painting package, the machine would never be able to take over the Amiga's fighting chance against the overpriced, useless DOS/Win 3.1/95 rubbish everyone seemed to buy as the 'safe bet'. The software sold the Amiga at the end of Commodore's life, and at the start of the Commodore Amiga's life the perfect Marble Madness coin-op conversion in 1986 was a killer app that made every teenager playing on their ZX/CPC/64 want an Amiga... any Amiga... even the pig-ugly, cheap and nasty build quality of the Amiga 500, utter crap :) Zarch is an AMAZING game and Virus on ST/PC/A500 is rubbish, but after that things kinda didn't improve by the time the A3010/3020 faster ARM home-computer format cases went on the shelves. The fast screen memory and fast CPU of even the first Archimedes on sale were well known by ACE magazine to be fast enough to manipulate the screen memory 50 times a second without running out of CPU power, so like the 1987 Amiga 500, people did want it for the specs alone too.
I'll expand on this answer, but please: NO, ZARCH IS NOT AMAZING AT ALL!
It was coded for the ARM1 (so there are no multiplications in the code), and the filling routine is slow... many more optimisations would have been possible.
Take a look here :
stardot.org.uk/forums/viewtopic.php?f=29&t=10313&hilit=hacker+for+zarch
with the links given to some YT videos.
See StarFighter3000 and you'll understand how Zarch doesn't even scratch the surface of what a well programmed 3d game can be on the Archies.
I originally incorrectly wrote there was MLA etc. in ARM1 - I really meant the first ARM in Archimedes machines had multiply instructions. So sorry that I missed the point being made - Zarch was coded on an ARM1 proto then? ARM2 had MUL and MLA instructions but no division. (Edited and corrected - ARM1 wasn't in the first Acorn Archimedes machines)
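To make the multiply point concrete: even on an ARM with no MUL, multiplying by a constant stays cheap because the barrel shifter folds a shift into an ADD or MOV for free. A minimal sketch (my illustration, not code from Zarch):

    ADD R0, R0, R0, LSL #2   ; R0 = R0 + (R0 * 4), i.e. R0 * 5
    MOV R0, R0, LSL #1       ; double it, giving R0 * 10, still without MUL

A general multiply of two runtime values needs a short shift-and-add loop instead, which is one reason code written with the ARM1 in mind tended to avoid multiplications altogether.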
Re: Mike Hunt - Might not have been so bad on the RT if Microsoft hadn't enforced its pathetic "Modern UI Windows Store" ONLY requirement for third-party apps, which allowed no normal Windows apps, no services, no drivers to be developed by third parties. It failed not because of the ARM chip, but because no one was allowed to port their Win32-based apps over to the platform. A rather silly mistake to make IMO, and they are repeating it yet again with Windows 10 S.
Umm, the Archimedes was actually nothing that amazing, and RISC CPUs have gone back in the CISC direction. I don't know why they even call it RISC anymore. Besides, the Amiga was the more revolutionary for its time and terribly underrated. It has influenced computing today more than the johnny-come-lately Archimedes. Acorn failed because it was too little, too late. In this video you make out that it was so ahead of its time, when in reality it was beaten to the market by years by a superior competitor who you FAIL TO MENTION (because you're scared?): the Amiga OCS 1000 and 500.
At 4.6 MIPS with 32-bit instructions in 1987, the processor in this thing was certainly faster than most things around at that time, especially in its price range, high though it was. Some operations which would take other CPUs multiple instructions even came for free, i.e. moving a rotated or shifted value from one register to another in a single cycle. Another fabulous speed-up is something like "ADD r0, r1, r2, lsl #3" (r0 = r1 + r2 shifted left by 3 places), a single-cycle operation which would replace 3 machine instructions on a traditional processor of the time, including the 68000, which IIRC took a cycle or more for each shifted position as there was no barrel shifter. It was a thing designed for real computation, not just for playing games on. Acorn tested every processor available for their new machine, and couldn't find one good enough, so they designed their own. No mean feat for a small British company, and they should be given due credit for that at least. The OS was super friendly by the time RiscOS 2 was released. But even from the start it had just about everything needed to write a game: BASIC and assembler, or even a mix of both. As said, the Pacmania code was written using nothing other than the inbuilt ARM assembler. Pacmania took some 2 or 3 seconds to assemble the program from its ARM source code contained in the BBC BASIC environment.
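To illustrate what Shaun means by the shift coming for free, here is the sort of thing a single ARM2 data-processing instruction does in one go thanks to the barrel shifter (my illustration, not code from Pacmania); on a 68000 the same work needs a separate move, a multi-cycle shift and an add:

    MOV R0, R1, ROR #8       ; copy R1 into R0 rotated right by 8 bits, one instruction
    ADD R0, R1, R2, LSL #3   ; R0 = R1 + (R2 * 8); the shift by 3 costs nothing extra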
If it was so damn good why was it such a failure? Blah blah blah, excuses and more.
It had no divide or multiply instruction.
Miggy4Eva Was it a failure though? When you consider the processor born out of it is now the most used processor in the world, I'd say it did pretty well. I'm a huge Amiga fan and it has a lot of nostalgic legacy, but the BBC/Archimedes projects resulted in a far greater real world legacy in ARM.
RetroManCave just because it had a great legacy, it doesn't mean it was particularly successful as a product. Look at Edison's phonograph. The first time anyone had ever heard their own voice. But within 20 years Edison's machines had all but gone, to be supplanted by shellac (later vinyl) records.
Shaun Hollingworth compare the 4.6 MIPS Archimedes to the 386 at around 12-13 MIPS, which predated it by 2 years IIRC. Or just wait for the even faster 486 a few years later. And the later Archimedes had no chance of outperforming a Pentium.