@23:55 "it's too expensive" 2025 still too expensive. The fruit company was able to convince a section of the population that expensive means great product.
The Mac of today has nothing in common with the Mac of the 80s. Back then they used different processors and had a completely different system architecture than PCs; now everything is almost entirely the same.
Jobs took Raskin's design for an affordable computer... and he screwed it up. Xerox knew a personal computer powerful enough to run a GUI would be hideously expensive at that time... and that's why they didn't market the Alto. See also: Canon Cat, Xerox PARC
What have we done with this awesome technology? We've enabled perpetual outrage mobs and #metoo and stupid cat videos! NOBODY could have foreseen how badly technology would be abused to fuel outrage culture, though a lot of people could see the loss of privacy that was coming.
Why did Apple want into business instead of gaming? Now Windows is the gaming platform, and it's responsible for so much more spending than business applications. I drop thousands per year upgrading and buying gaming hardware and software. We only spend a small fraction of that at my business on productivity applications.
that’s a fair question. First, Jobs had used psychedelics and knew that the computer would push the human race forward (his words). He probably spent very little time gaming himself. Second, Jobs left Apple to found NeXT which was a computer for educational institutions (again, to push the human race forwards). Gaming is a timehole and he probably didn’t see it as a life well lived. Third, the Mac did have games. Tetris comes to mind.
@@blackrockcity I reserve gaming for snowy cold days, but really enjoy it. I do tend to think that I could be doing something productive instead and the older I get the less time I spend gaming.
Hugely overpriced and hugely underpowered, as Apple products so often tend to be. I guess it deserves credit for bringing mouse-driven GUIs to the masses? Just one year later the Amiga came out and completely obliterated this thing in every possible regard, while costing a lot less
Of course, looking back more than 30 years later, things are different today, but this show was shot in 1985, and that's the lens it was intended to be looked at through. You *did* hear where he listed the reasons *of the time*, right? Small (9 inch diagonal) black & white screen, too expensive, etc. Add to that that it only had 128KB of RAM, which was small even for its day, no hard drive, only 400 KB floppy disks, etc.
@@SweetBearCub Well, technically the prediction stood its ground. The PC market ended up swallowing up 90% of the marketshare of all computers ever sold. The Mac held a niche market for things, but big corporate companies that started putting a computer on every desk for their employees in the late 80s never went to Apple.
For desktop computers and laptops, Macs are still a small part of the market. The install base of Windows/Linux PCs is many times the size of the Mac base. A major factor is the pricing (you can get a budget computer for as little as $300), but also because people are familiar with using a PC with Windows and MS Office at work (Macs are mostly used in companies that do professional graphics or audio). And of course the majority of gaming PCs are Windows machines because a lot of titles, especially indie games, are not Mac or Linux compatible (though Linux compatibility is a lot more common now than 10 years ago). (If you don't believe me, check this page on market share of different OSes per month for 2013-2019: www.statista.com/statistics/218089/global-market-share-of-windows-7/ )
The Mac-as-computer has almost always been a home-user or student product, with some small-volume niche markets like desktop publishing and video that no longer require a Mac in any way at all. Plus the last good computer they made was the IIci.
The Mac was a waste of time; they would have done better by not restricting the Apple IIGS, which was color capable and backwards compatible with the Apple II. Did I mention the color screen? I contend that if Steve Jobs hadn't crippled the GS's capabilities, the GS would have been far, far better. The Mac was a downgrade, not an upgrade.
Backwards compatibility itself kneecapped the IIgs before it even got out of the gate. And Jobs wasn't at Apple when the IIgs came out. So for instance, IIgs fans always moan about how terrible it is that Apple limited the CPU to 2.8MHz to avoid competing with Mac. But there's another, worse limitation baked into the IIgs architecture: the graphics memory (including the IIgs graphics modes) is part of the 1MHz "compatibility" core of the machine - so the slow CPU isn't even the limiting factor here: as slow as the CPU is, the video RAM is *slower*. In other late 80s platforms like Amiga and Atari ST, the CPU was supplemented by the chipset, which could ease the burden on the CPU. Even if the IIgs CPU had been exactly as fast as the Amiga CPU, the lack of this kind of support hardware meant it was forever limited by that display RAM bottleneck. I contend that the overall design of the IIgs (resulting primarily from choices made for backward compatibility) made it an underperforming, dead-end design mired in almost decade-old technology. On paper the system could look competitive with other late-80s machines like Amiga and Atari ST but couldn't back that up with real-world performance for a variety of reasons. I really think Woz was wrong on this one: Carrying forward the Apple II design was not a good plan for the future. But remember also the Mac came out before the IIgs. The original Mac was double the price of a comparable Apple II (A complete Apple IIc, with monochrome display, was $1300 in 1984, while the 1984 models of Macintosh were $2500-$2800) - but it had a faster CPU, wider memory bus, a higher-capacity disk drive (400K on the first Mac models, vs. 140K on the Apple II), and a higher-resolution display (512 x 342 vs. Apple II's 560 x 192 double hi-res, or the more typical 280 x 192 hi-res). The Apple II could display color, but it wasn't really good at it (typically 6 colors at 140 x 192, though some software used a 15 color 140 x 192 mode) so I don't count that as much of an asset. Mac was more expensive but it was a better machine. Then when the GS came out - shortly afterward the Mac platform got a major upgrade with the Macintosh II (including color graphics!) - but of course that was a much more expensive machine as well, around $5500. The IIgs started at $1000 (approx $1500 for a minimal complete system with one disk drive and a monitor) and the first model was shipped with a bare minimum of RAM (256kB, half of which was "slow RAM" in the 1MHz Apple IIe core) So at that point the price gap had widened to a factor of 4 or so, but Mac was still the better machine in my opinion. (If you wanted a better machine *and* a better price, you could get an Atari ST... Or an Amiga 500 once that came out.) The Apple II was a great machine for its time (i.e. late 1970s to early 1980s). But its real strengths were in the clever engineering hacks that allowed Apple to make such a great combination of power and economy. But those kinds of hacks don't really translate into a good long-term foundation for a computer platform. As it was, the Apple II (particularly the IIe) carried Apple through the 1980s, long enough for them to turn Macintosh into a platform that could carry them through the 1990s. 
I think they did about as well as they could have possibly hoped to have done, being one of very few prominent 1980s computer brands to still be relevant in the early 2000s (and the only one of the 1977 "big three" of Commodore, Atari, and Apple to even survive that long) - to say nothing of today, *another* 20 years later! I really doubt they would have done better than that by sticking with Apple II as the foundation for their machines.
I like how you can still hear the loud grumbling of the mac's 400k drive even when Bob Foster is trying to talk during his bit
yeah he’s dumb. he should’ve bought a gotek drive to go with it. he would’ve saved a ton on disks
Man, those 80s gentlemen sure knew how to tie a perfect tie knot. Very stylish
well, buddy up with them and they can teach you their secrets
These product managers were excellent presenters. They even timed little side chit chats perfectly timed to the disk thrashing latency. Bob was an expert.
"They're both overadvertised shiny boxes whose contents disappoint!"
Damn.
"McWrite and McPaint" ???? Steve must have went nuts when he heard that.
Apparently MacPaint was going to be called "Mackelangelo". 👀
it’s called MacWrite and MacPaint 🙄
That was a very crazy time in the high-tech arena, especially so in Silicon Valley, as I was working in administration for a high-tech startup company in Mountain View.
There was one engineering department manager who wanted his whole group on Macs for development of CAD/CAE design products for ICs. The problem was porting those designs over so that Apollo workstations and Digital Equipment Corp [DEC] minicomputers would recognize them.
I found out later on that the manager was using that product development ploy to chisel down the price of the Macs from Apple direct; he had no intention of using the Macs for development, only for his own personal use.
Apple got wise to that stunt and slammed the door in the company's face for any Apple/Mac products at discounted prices.
I mean, maybe they should have... because they are worth less than nothing... I can't name a single Mac product that isn't on PC, or doesn't work better on PC...
And pretty much since the early 2000s it's mostly been that way.
The Mac fanboy art fans will say it's the display, but... mind you, 120dpi to 144dpi is standard in all flat panel monitors now.
Oh, it's the hard drives... nope. Samsung has that one...
Oh, it's the RAM... nope. MANY companies beat the snot out of Apple in pretty much everything.
Including on phones.
@@acidangel111 Starting in the year 2000, Mac began designing better operating systems, to the point that Microsoft copied many Mac features into Windows, from its beginnings up to the present day.
@@acidangel111 I worked on MANY Samsung phones. Total garbage. Also, I won't waste my time listing apps exclusive to macOS (and no, I don't mean GarageBand).
@@acidangel111 Also, what's your deal? No one is forcing you to use a superior OS. It's not your money, so why do you strike me as one of those people who thinks they're smarter than anyone who disagrees with them or doesn't know some suuuuuuuuper obscure drive specs, etc.?
@jimmybuffet4970 ROFLMAO. Superior OS? Try garbage at best.
What's my deal? I don't know, is this "what's the deal"? Is there a door number 1, a door number 2, or a big box? I'll take the big box.
Ohhhh, it contains trash? Well, still better than any Mac.
Fun fact.
WOW, a one-button mouse... I never tired of the Mac fans telling us how the one-button mouse was so... vastly superior to the multi-button mouse on the PC.
you only need one button. you use the shift key.
@@jessihawkins9116 if you need to use a keyboard key, then wouldn't it be much easier to just have another button?
one button enabled the mouse to be used by the disabled. Bet you never thought of that!
Don't forget the MacOS UI is much different than Windows: the menu is always on top, and there is no popup menu for a selected element. The single button was to avoid confusing users and to force them to keep one model in mind: you select an element, and the menu is always at the top of the screen. There is no second button because there is no popup menu. A single-button mouse would not make sense on Windows. A popup menu makes sense only on a high-resolution screen.
@blackrockcity Indeed. The Macintosh is the perfect computer for the mentally disabled.
Wow. Gary (owner of Digital Research) really seemed to like GEM, by Digital Research. Interesting. ;-)
That was totally shoehorned in too, with no context for what they were showing. Not even a contextualizing comment from Stewart to explain how this PC thing is relevant to the discussion. I guess he was just letting Gary do his little ad segment and then moving it all along.
Well, GEM is excitement!
he invented the stuff, bruh
@@jessihawkins9116 Yep, that is why it is funny... :-)
For those who don’t know, the host of the show invented GEM.
Yup, this was a total conflict of interest and the negative attitude toward the Mac was totally biased.
I love watching this old shit!
Sometimes it looks like the 80's happened 50 years before the 90's. Everything about the 80's looks and feels ages older than the 90's.
bruh what
Moore's law
The 70’s looked like 1832
This comment reminds me of computers in the Italian Renaissance. Big Vibes.
1970’s looks like 90’s Unix looks like 2020’s Unix 🤷♂️
The Validec for computer ordering at the restaurant was amazing and way ahead of its time. It's really fascinating and instructive to watch these shows and read Byte magazine, even today.
1:52 As a software guy, I resent the idea that a ground-breaking OS with integrated graphics engine could not be “technological innovation”. Remember, this thing let you interact in real-time with on-screen drawn images using just a CPU running at 7.8MHz, with no hardware acceleration!
In that statement he clearly separates hardware and software. The hardware was not innovative in any way.
More like the lack of hardware was innovative. Wozniak liked to do everything in software that used to be hardware-based, which led to a lot of flexibility on the software's part.
Amiga Inc was starving for a cash infusion and went to several companies before Commodore. Apple was one of them, with Jobs declining because he found the custom chip design too complex and potentially limiting for future expansion options. This turned out to be true, as the NTSC-based video system really constrained high-resolution video output, even to this day. And further custom chip development was predicted to be costly as well, another prediction that held true.
@SteelRodent While true, it ignores a huge part of what makes Apple successful - they took a system that was interesting but impractical, and tuned it for consumers (and businesses to an extent).
That’s what people always miss when they talk about Apple: they always complain that Apple isn’t innovative, but they actually are - the innovation comes from design refinement, which is arguably just as important and difficult as the initial idea.
Did that make it groundbreaking? Yes.
It certainly was a remarkable software achievement, but what the Atari ST and Commodore Amiga demonstrated was that there were significant gains from better hardware for a graphic task-set. Ex-Mac system engineers actually founded Radius for exactly that reason.
That 16:40 "okay lee thanks" should become a hiphop or techno sample. Make it so!
Larry Tesler describing his future vision of agents doing our work is priceless @ 22:45
18:43 As a software guy... the first LaserWriter we got took so long to image graphics pages that I named it "Deep Thought".
It was built around the legendary Canon LBP-CX engine, also used in HP LaserJets of the time. That thing could really cope with abuse, i.e. high print loads.
And yet it had a CPU running faster than the Mac that was driving it!
Ya, but Apple stole the GUI from the Xerox Alto.
@@mikcnmvedmsfonotekaOh please… Read about what Larry Tesler, Andy Hertzfeld, and Bill Atkinson invented.
Reminds me of the LaserJet 4M retired from my father's job at ComputerLand. We managed to wrap the lifetime page count back around to zero before the main motor finally gave out.
0:30 Whoah! that Mac is massive! they really shrunk it down over the years
I got to work with Jazz and the LaserWriter, the Mac Plus, and a lot of other computers when I was 23. I worked at ComputerLand, my first real job. The LaserWriter was over $10,000 in '85 dollars.
I didn’t expect to see Kate Winslet in this!
who’s that?
@@Xenotypic Kate winslet
@@jessihawkins9116 perhaps i was being a tad rude, i had just woken up. sorry bout that
@@Xenotypic who is she?
@@jessihawkins9116 I think she played the part of Joanna Hoffman in the Steve Jobs movie
Paul Schindler says the Mac doesn't work and it sucks, and a few years later he's showing off Mac software in future episodes.
literally everything he said each episode turned out to be wrong, but at least he was self-assured lol.
He makes a huge point about not owning an IBM compatible and never planning to get one... and then became editor of Windows Magazine.
He underestimated the stupidity of the average Apple consumer.
"It enables you to stare at a tiny little portable TV and imagine how great computers will be one day. But not yet."
It's almost like it took decades of innovation and effort to enable people to swipe through Facebook, Twitter and Instagram all day.
08:45 Did he just switch it off and on again? Didn’t they have the Programmer’s Switch installed? Just hit reset!
Has Paul Schindler ever had an opinion about some thing that turned out to be correct?
Honestly, had the Spindler era not happened, I'm sure things would have turned out differently... but by the late 90s, when Jobs came back, it was too late.
Joanna Hoffman is on this episode.
I've gotta smile at Paul Schindler's look into the future of the Mac as I sit here at the end of 2021 watching this old vid on an iMac, running an OS whose roots are the result of Jobs' NeXT adventure, with a beautiful color screen, on the internet. This really isn't a put-down of Paul. He honestly looked at the landscape of the industry at the time and made a reasonable prediction. In fact, many Mac users (a more loyal group you can't find) in the late 80's and 90's went through a period of feeling abandoned by Apple. You just never know what's coming tomorrow.
He's right; even to this day the Macintosh is not taken seriously in the business world. After the iPhone became popular, some smaller businesses started using iMacs, and there's always been a place for Macs in creative and artistic endeavors. But the PC, pioneered by IBM and Microsoft, has always dominated the overall business market.
@@JaredConnell
I don't wanna get into a religious war over Mac vs PC,
but I would ask you to look at Paul's reasons for discounting the Mac.
He claims the Mac is flawed?
Is it?
How?
He claims it's too expensive. True, it's initially more expensive than a PC, but it also has a longer life cycle, it's easier to use, and it has better security.
He claims it's not fast. That's clearly no longer true. I just got my wife a new Mac with the M1 chip. It's blazingly fast. And I'm using an iMac which has the Intel chip in it and I still get great speed.
It's got a small screen. Today that's laughable!
It's in Black and White. Uh huh . . . seen the Retina display??
So bottom line, none of Paul's points as to why he rejected the Mac back in the 80's are valid today.
And as far as it not being taken seriously, I guess that depends on the business you are in. If you do graphics and/or video work, it certainly is taken seriously. Or if you are an iOS developer, it's hard to find a better platform to work on.
@@Jim-mn7yq He was correct. Apple nearly went bankrupt in the 90s.
Additionally, Apple had two competing product lines: the Apple II and its upgrades, and the Mac.
As to your points about Apple nearly 40 years in the future...
Macs don't last as long because they are much harder to upgrade in hardware, though recently they've become a bit easier. Even so, they often require Mac-specific components, as opposed to generic components, meaning upgrades are much more expensive.
Additionally, old Macs can't upgrade to the newest macOS, which hinders their software support and leaves them less secure.
The brand-new M1 has been shown to be slower than, say, the newest AMD chipset and graphics card for intensive tasks (transistor count is a physical limit you can't get around). The M1, being based on ARM, is far more efficient than x86-64, and this also makes it speedier for little, immediate things.
As for screens, for the same price an Apple monitor will usually be worse than a generic monitor. What 27"+ 4K Retina monitors can you buy for under $300?
Also, if it weren't for the iPhone, Apple as a computer company wouldn't really exist as it does today (it would still exist, but the iPhone is propping it up immensely).
@@lelsewherelelsewhere9435 "Apple’s market cap recently surpassed $2 trillion. A historic moment in the history of the U.S. stock markets. It is the first company in history to accomplish this making it the largest and most valuable publicly traded company in the U.S. It passed the $1 trillion mark only two years ago in 2018 after being a publicly traded company for 38 years. Phenomenally, it only took a mere two additional years to increase another trillion dollars in value."
Need I say more?
@@Jim-mn7yq Well, yeah, current day Macs are a completely different product than the Macs back then. Of course his comments don't hold up if you apply them to current Macs. But the Macintosh he was reviewing had the flaws he went through and stated, and true to his prediction, it never was taken seriously in the business world.
When I had originally seen this episode of the Computer Chronicles back in 1985 I wanted to get a Macintosh computer, but I soon discovered that Apple both priced them out of my price range and that there weren't any Apple stores in my hometown. I ended up getting myself a Windows PC just 12 years later instead.
The Xerox Alto had a pretty impressive screen resolution.
this video is about the Macintosh
the alto is as big as my clothing drawer
I volunteered for their afternoon recruitment drive for drone "workers", who thought "Coowell"... was something then... enough for me then. "It is now" - thanks, guys!
I am still using a 1984 Macintosh to this day. I regularly go online with it on Bulletin Boards and communicate with people. It’s the early internet and it runs very well on these old systems. Hard to believe my Macintosh was around when this episode was being filmed... all before I was even born.
no you aren’t
@@jessihawkins9116 wut?
@@Bendaak What BBS services are you dialing? How are you dialing them on the current digital phone networks? You say you're doing this, but you provide no explanation or proof of how.
@@jessihawkins9116 I am not dialling via a phone line; however, it is still possible today, as there are BBS services available, such as Level-29, which can be authentically dialled into. I connect via Telnet using an RS232 WiFi serial modem. This is the most common method of connecting to a network on a vintage machine.
@@Bendaak ha 😆 you just told on yourself right there. Macs have RS422, not RS232. You made it all up. 😐
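For context on the exchange above: a WiFi serial modem is essentially a serial-to-TCP bridge in firmware; the vintage machine speaks plain serial, and the modem carries the bytes to a Telnet BBS over WiFi (and for what it's worth, a Mac's RS422 port can drive RS232 gear through a common adapter cable). Here is a minimal host-side sketch of the same idea in Python, assuming the pyserial package is installed; the serial port name and BBS address are illustrative placeholders, not the actual Level-29 details.

```python
# Minimal serial <-> TCP bridge: roughly what a WiFi serial modem does in
# firmware. Assumes pyserial (pip install pyserial). Port name and BBS
# address below are hypothetical placeholders, not real Level-29 details.
import socket
import threading

import serial  # pyserial

SERIAL_PORT = "/dev/ttyUSB0"   # hypothetical USB-serial adapter on the host
BAUD = 9600                    # a rate a vintage serial port handles comfortably
BBS_HOST = "bbs.example.com"   # placeholder BBS address
BBS_PORT = 23                  # standard Telnet port

ser = serial.Serial(SERIAL_PORT, BAUD)                 # blocking serial reads
sock = socket.create_connection((BBS_HOST, BBS_PORT))  # raw TCP; Telnet IAC
                                                       # negotiation is skipped,
                                                       # which many BBSes tolerate

def serial_to_net():
    """Forward whatever the vintage machine sends out to the BBS."""
    while True:
        data = ser.read(ser.in_waiting or 1)  # block until at least one byte
        sock.sendall(data)

threading.Thread(target=serial_to_net, daemon=True).start()

# Forward the BBS's output back down the serial line.
while True:
    data = sock.recv(1024)
    if not data:  # remote side closed the connection
        break
    ser.write(data)
```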
Thank God they invented barber shops since 1985.
What’s a barber shop?
"McWrite and McPaint" lmao
Careful, McDonald’s might sue.
25:32 - Ah, the Commodore LCD portable! I always thought they had actually brought this to market but I guess they never made it happen...
If you follow Bil Herd on YouTube: he was the designer of that computer and was not pleased that they never released it. He said that Commodore didn't see any future in portable computers!! I'm pretty sure he has one of the few prototypes that were made, and he even got his unit booting on a live stream he did a while back.
@@DavePoo2 It was scratched due to the fact that Commodore bet all its money on the Amiga, and after the Amiga A1000 fiasco Commodore almost went bankrupt, so there was no room for an incompatible portable C64 anymore.
The original 128k mac was really more of a tech demo than an actually usable computer. Sort of a "wouldn't it be neat if".
I haven't actually tried one myself (even in emulation) - I'm kind of curious about the extent to which this is actually true.
Like, for instance, I used to run GEOS on a Commodore 128, like all the time. Lower overall screen resolution than a Mac 128K, slower CPU, technically a little more RAM since the display controller had its own 16kB... Better floppy drives (I had a 1581) - It was slow, of course, but I found it useful. Low bar perhaps but I have to imagine the Mac 128 was at least better than GEOS.
the 512k and Macintosh Plus were better machines
@Tetsujin - I own an original 1984 Macintosh (albeit upgraded by its former owner to a 512Ke with 1 MB RAM) and can tell you the above statement is accurate; it was basically a glorified "tech demo". It wasn't until the Macintosh Plus, with its 1 MB RAM (expandable to 4 MB), 800K floppy, SCSI hard disk support, proper keyboard (cursor control!), and larger ROM firmware (more and better-optimized Toolsets), that the Macintosh actually became usable and viable. Some might even argue it wasn't until the Macintosh II, more than a year later, with its modular/expandable design and color support.
And yes, I'm familiar with GEOS on the C64 and Apple IIe, but those were 8-bit machines, running with low graphical screen resolutions and very tight memory constraints. The Mac had a 16-bit CPU, could address far more memory (linear, not bank-switched), and had a fixed screen resolution with almost double the number of horizontal and vertical pixels to push around. It was far under-equipped to handle all this. Apple made the same mistake with the 16-bit Apple IIGS, shipping it with only 256K RAM, but at least it was user-expandable up to 8 MB. FYI, between its limitations and high price, the Mac was never even considered a home computer until the early 90's, with the introduction of the Macintosh LC. However, by that point it was far too little, too late. The DOS and Windows PC had become a much better choice. (I still think the Apple IIGS could've kept Apple in the home market game, given the chance, but they threw that opportunity away in favor of the lesser Macintosh.)
@@Apple2gs It is simple: RAM was very expensive back then. Even the Atari ST was planned with 128kB of RAM, same as the Amiga. In late 1984 RAM prices started to fall, and Atari dropped the 128kB version in favor of 512kB, while the Amiga got 256kB as a baseline. It was at the same time that Apple introduced the Mac 512. Even DOS computers in 1984 usually shipped with 128-256K of RAM, and 512K was seen as a luxury. The IBM PCjr had a whopping 64K or 128K of RAM.
RAM was expensive in 1984, I'm telling you...
Next generation, in black and white, when the Amiga and Atari 520 ST were multicolor.
Neato. Atari is going to release a 10MB hard drive for only $600!
Those were the times.
If you low-level format it with an RLL controller, you might be able to get 15 or 20 MB!!!!
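The joke checks out, roughly: MFM controllers of that era formatted ST-506 drives at 17 sectors per track, while RLL (2,7) controllers fit about 26 on the same track, roughly 50% more. A quick arithmetic sketch, assuming those standard figures:

```python
# Capacity gain from low-level formatting an MFM drive with an RLL controller.
MFM_SECTORS_PER_TRACK = 17   # standard ST-506 MFM format
RLL_SECTORS_PER_TRACK = 26   # typical RLL (2,7) format

drive_mb = 10                # the 10MB drive from the comment above
rll_mb = drive_mb * RLL_SECTORS_PER_TRACK / MFM_SECTORS_PER_TRACK
print(f"{rll_mb:.1f} MB")    # ~15.3 MB, the low end of the "15 or 20 MB" range
```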
And a 500MB CD-ROM for $500.
"low-hang·ing fruit
nounINFORMAL
a thing or person that can be won, obtained, or persuaded with little effort.
"we know mining our own customer base is low-hanging fruit""
[ laughs in solid state ]
When the guy said it replaces the "A Greater Than", I literally had to play it again to figure out wtf he was talking about. Oh, like the A:> prompt.
I guess they didn't quite have the terminology figured out back in the 80s.
Out of curiosity, how would you read "A:>" out loud?
@@andreiandrosoff1327 "Eh Colon Prompt" (i wouldn't verbalize the greater than symbol.)
Or just A-prompt. Or C-prompt or whatever. That seems to be what was commonly used in the 90s which is when I really dug deep into computers originally.
bruh what
@@andreiandrosoff1327 lol right
You do realize that DRI was the very company that "invented" the A:> prompt for micros, so they have every right to read it however they like, I think.
I wonder what the universe would look like if Paul Schindler was right on all his predictions :)
Predicting the future can be a fool's errand. Sorry Paul, you didn't get it right.
His prediction was true: Steve Jobs was fired, the Apple II was their most popular machine, they lost money on the Lisa, and the Mac wasn't profitable. You couldn't make much use of it until it was upgraded to 512K. It didn't age well, and it took a few years for the Mac to become actually useful.
From what I understand, this was the beginning of a time of tumult at Apple, such that years later, the company brought back Steve Jobs, getting that "weird Unixy NeXT thing" in the process.
22:43 Almost 40 years later, we just have AI agents that confidently lie to us.
@25:33 A shame this Commodore laptop never came out. They owned some key LCD patents, but then decided LCD was a path to nowhere and sold them off.
imagine having to be the guy pretending macintosh is not a piece of crap compared to the Apple IIgs which was deliberately slowed down because Steve Jobs had a petty ego
@Luke He looked visibly uncomfortable to me! Also, let's not get into an argument about subjective versus objective reality here, okay? You're trying to tell me that the software applications made up for it, but they could have made the same software, only better (with color), for the Apple IIgs. Ego gets into marketing big time. I know this because there are many business and sales models in operation right now that don't make any sense. Take for example the business of collectable miniatures: here we have a gaming company with a business model created by gamers, not businessmen. Just go check out that crapshow and you'll see what I mean. It's basically gambling.
@@quonomonna8126 Also, let's not forget the price. The Apple IIgs still beat the Mac in many areas even when it was slowed down because of Jobs' insistence on crippling it: it had GS/OS, it was in color, it had a faster CPU, and at the time it had more software and more games too. The Apple II team took the idea of the Mac and executed it way better. It's been confirmed many times that Steve hated the Apple II because he thought it was older tech and the Mac was the future, so when the Apple II team executed his vision better, he was angry. Even years later, at the introduction of the iMac, Wozniak wanted him to give a shoutout to the Apple II team because they were one of the main reasons the company was alive at the time; he wouldn't do it.
@@Bsodman I forgot none of those things
@@quonomonna8126 Yes yes, this was mainly aimed at Luke, even though I forgot to tag him. It's just sad that so many people forget what a dirty start the Mac had.
Jobs left Apple a full year before the IIgs came out.
And bear in mind you're kind of conflating two versions of the Apple IIgs here: the one that existed in the real world in 1986 (a year after this episode of Computer Chronicles aired) and sold with a bare minimum of memory, no display, and no disk drives for $1000 - and the imaginary version that lives in the minds of IIgs fans, the version that would have run at 8MHz or something because of that one time Woz said it'd be awesome to have a machine built around an 8MHz '816. The latter does not exist in reality. We don't know what kind of IIgs we could have got, and what that hypothetical machine would have cost, because it was never made. The closest thing we have is IIgs accelerator cards. Get a *really* good one of those, and a IIgs can *almost* keep up with an Amiga 500.
The IIgs that exists in the real world has more wrong with it than a slow CPU. It has a slow *architecture*, because a lot of its architecture is Apple IIe architecture, stuck not only at an 8-bit data bus but also locked to 1MHz. The video RAM is part of that: in IIgs super-hi-res mode you get one frame buffer, which means no page flipping, and it's in slow RAM. Combined with no hardware sprites or background scrolling, it really creates a bad situation for games. The best case as I understand it is to double-buffer and use the IIgs's RAM shadowing, triggered with a technique like "PEI slamming" when it's time to update the display RAM - if I got the math right, that means you'd spend around 35ms updating the display from your double buffer, so you could maybe run the game at 15fps and only spend about *half* your overall CPU time telling the hardware to update video RAM from the double buffer (see the quick check after this comment)! Or you can make "Arkanoid" or something where you only have to update smaller areas of the display on each frame.
If Apple management really kneecapped the IIgs, Woz should have signed it "Alan Smithee" instead of "Woz". (Though it is nice the guy got to sign a machine in the Apple II line.) I think its problems run much deeper than CPU speed; it's baked into the whole design. Look at the SNES for instance. It has a faster CPU than the IIgs, though not by much (3.58MHz vs. 2.8MHz), but it is *way* better at moving graphics around the screen. That's because, like most good game systems of the era, it's all about the rest of the chipset - hardware support for layered scrolling playfields and sprites makes a *huge* difference.
I agree that the early Macintosh machines were fairly limited. I think those limitations were a necessary compromise at that point in the platform's history. It was slow enough as it was, frankly, and going color would have made it slower - or more expensive. What it offered instead was a high-resolution display (at least, higher than most color displays at the time) with relatively low RAM and CPU requirement. Later, more powerful Macs (1987 onward, Macintosh II line, etc.) were more powerful but with a price to match. Machines like the Amiga, Atari ST, etc. at the time were a good middle-ground, reasonably affordable but reasonably powerful, and that's what the IIgs should have been - but it really fell short of that IMO. We can imagine what if maybe it hadn't... but that is not how it turned out in reality.
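A quick sanity check of the frame-budget arithmetic in the comment above; the ~35 ms copy time is the commenter's own estimate, not a measured figure:

```python
# Frame-budget check for the IIgs double-buffer discussion above.
copy_ms = 35.0            # estimated time to push the double buffer to display RAM
fps = 15                  # proposed game frame rate
frame_ms = 1000.0 / fps   # about 66.7 ms per frame at 15 fps

print(f"frame time: {frame_ms:.1f} ms")
print(f"copy share: {copy_ms / frame_ms:.0%}")  # ~52%, i.e. about half the CPU time
```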
'I think, I'll tip via homebanking... 😅'
Oh Stewart, what a crazy idea... 🙃
The Apple IIe was the computer that pulled me away from the Atari 2600.
23:53 And then ... there was PageMaker.
11:20 Highlighting states that have more than 50 computer stores.
The times they are changin
Give it 30 years, Mr. Schindler.
Paul Schindler, always missing the point in his comments.
André Ferreira total clown.
Practically everything he says is completely wrong, but he's at least more interesting than George Morrow. In Morrow's commentary segments he just says some obvious stuff every time
She just pronounced MacWrite and MacPaint as "McWrite" and "McPaint", like McDonald's....
I'm just glad the operating system was still called "System" then. I don't know if I could handle somebody on Computer Chronicles calling it "May Koss".
@@ChrisTheGregory And let’s not forget the ever-popular “OS Ex.” 🤓
0:57 Always Scares The Shit Out Of Me
Pretty sure that's just a drawing of the Babbage difference engine.
that's the best part
The mathematician daughter of one of Britain's most talked-about poets figured out how Babbage's follow-up to the difference engine, the Analytical Engine - had it been fully built - could grind out Bernoulli numbers.
22:30 and now we're here in 2024, where "work from home" has become widespread!
Paul Schindler was right here: there is very little macOS use in business these days - even Chrome OS and Linux see heavier use. iOS is a different story though.
OMG, you clearly don’t live in America.
Interesting watching this knowing Joanna is a billionaire now.
she's always crazy intense irl lol
@4:12 literally twice as big as a mouse
Are you really that dumb? Did you fail basic English comprehension?
@@BlownMacTruck I stand by my statement. That is literally double the size of the 'space hungry' mouse lol
@@mcswabin207 The trackball is smaller than the amount of space a mouse would need to operate. Apparently this concept is too much of a stretch for you to understand.
Gee, it had both colors... black and white... and low-end sound. The Atari ST and the Amiga came out later in '85: 4-channel sound, fully preemptive OS, 4096 colors... yep.. the Amiga was a dinosaur compared to the Mac.. (Geez, the Mac didn't have a fully preemptive OS till 2000-and-something?)
the Amiga had the same processor and could run Mac software in emulation faster than an actual Mac could
Neither the Atari ST nor the Amiga had that desirable Apple fashion-statement logo, still so important today ....
Was Paul Schindler ever right about anything?
Once again Paul Schindler getting it completely wrong.
I just stumbled on these episodes last week. Great shows. For that time period.
10-12-20
just now? geez what rock have you been under for the past 35 years? 😒
The Return of the Mack
I'm surprised GEM looked so good in 1985 on the 5170 PC AT. Did it allow multi-tasking? Maybe I should dust off my AT clone, 5154, and try it.
I’ve tried openGEM under FreeDOS, and no it did not. Can’t comment about the OG one
Neither allowed nor disallowed - it was out of scope. Multi-tasking is something an operating system has to support (or not). GEM by itself was just a graphical operating environment and widget toolkit that could be integrated into various products. What we see here are individual DOS apps using GEM for their UI: a file manager, a picture editor. However, the TOS operating system of Atari computers used GEM. TOS wasn't initially truly multi-tasking, but its later developments were.
Computer Chronicles got a lot of things right, but one thing they got wrong was how much time they spent talking about TopView, which was ultimately a complete flop.
IBM was a market leader but they had really already stopped innovating in any significant way.
Yep, but you have to remember that IBM totally rewrote the microcomputer world at that moment in time. In just 3 years, "IBM or compatible" had become the standard. Everybody expected that it would be IBM that defined the GUI on the PC.
Paul Schindler is so wrong it's hilarious 😂
Joanna Hoffman and Larry Tesler - it's a good one!
Hmm, I wonder if the GUI will ever catch on.
the computer operating system you use today is based on this technology.
@@jessihawkins9116 bruh what? are you taking him seriously?
@@Xenotypic GUI stand for graphical user interface. It is designed to be used with a graphics capable monitor and a mouse. This is what we use today.
@@jessihawkins9116 that's correct! and it's great.
@@jessihawkins9116 You don't pick up on jokes and sarcasm very well, do you?
Anyone know why the program title jumps around on the intro frame?
I'm watching this on my M1+ iMac
also you have a spotty oik telling Gary Kildall about GEM - the guy whose company, Digital Research, invented it!!!!!!!!
Dumb question: Did HyperCard "break" Bill Atkinson-i.e., drive him to ditch the IT game for photography?
I want a McPaint with fries.
Odd that technology is only seen as hardware.
OMG its really her
@23:55 "it's too expensive"
2025: still too expensive.
The fruit company was able to convince a section of the population that expensive means great product.
watching this as I realize that my software development machine has been a macbook for the last decade.
The Mac of today has nothing in common with the Mac of the 80s. Back then they used different processors, had a completely different system architecture than PCs, now everything is almost entirely the same.
Ugh. I can’t believe someone thought the Chicago font was a good idea.
The irony about this is that the creator of CP/M, a command line operating system, is showing off the computer that killed the command line.
I use the CLI in my Windows 11 business environment all the time. It's still way more efficient than using the GUI
Killed the command line? Where do you see it dead?
yeah he’s also showing off his own GUI called GEM without revealing a conflict of interest.
Is anyone watching this thinking "god...didn't M1 do this same thing??"
Paul Schindler is wrong; BigMacs aren't that disappointing.
24:00 Paul Schindler - his opinion was valid for a few years, but it didn't age well..
Computer Jesus in a suit
why can't I get this in 4K guys, sheesh!
It's a TV show! 360p at best.
@@WinrichNaujoks Jokes aren't your thing are they?
@@ChrisMusty ooohhh hahaha!
You can, just use AI to upscale it
nice steve jobs
Jobs took Raskin's design for an affordable computer... and he screwed it up.
Xerox knew a personal computer powerful enough to run a GUI would be hideously expensive at that time... and that's why they didn't market the Alto.
See also: Canon Cat, Xerox PARC
The Canon Cat was the most faithful incarnation of Raskin’s original idea.
Guess what--it didn’t sell...
What have we done with this awesome technology? We've enabled perpetual outrage mobs and #metoo and stupid cat videos!
NOBODY could have foreseen how badly technology would be abused to fuel outrage culture, though a lot of people could see the loss of privacy that was coming.
#metoo is an amazing movement. Don't be a douche.
@@ferrreira I hope she sees this bro
@@tarstarkusz cat videos are never stupid. Or at least they are much much better than stupid vertical videos and tiktok
10MB hard drive for $600, ha !
Sounds like a once-in-a-lifetime deal to me
10MB was a lot of space back then, considering floppies only held 400K.
Why did Apple want into business instead of gaming? Now Windows is the gaming platform, and it's responsible for so much more spending than business applications. I drop thousands per year upgrading and buying gaming hardware and software. We only spend a small fraction of that at my business on productivity applications.
that’s a fair question. First, Jobs had used psychedelics and knew that the computer would push the human race forward (his words). He probably spent very little time gaming himself. Second, Jobs left Apple to found NeXT which was a computer for educational institutions (again, to push the human race forwards). Gaming is a timehole and he probably didn’t see it as a life well lived. Third, the Mac did have games. Tetris comes to mind.
@@blackrockcity I reserve gaming for snowy cold days, but really enjoy it. I do tend to think that I could be doing something productive instead and the older I get the less time I spend gaming.
@@kdw75 I played a lot of games when I was a boy. I don’t strongly regret it. But they would feel boring today.
Whoa! Kids used to give their teachers a Macintosh? Mind blown.
he was referring to the McIntosh Apple 🙄
Hugely overpriced and hugely underpowered, as Apple products so often tend to be. I guess it deserves credit for bringing mouse-driven GUIs to the masses? Just one year later the Amiga came out and completely obliterated this thing in every possible regard, while costing a lot less
@25:31
Ah there is an irony in the theology of the Mac compared to what Apple stands for nowadays.
You mean how Apple embraces open source and utilizes it throughout their entire MacOS systems today?
errrrm the mouse was invented by DOUG ENGELBART AT SRI IN THE 1960s, not Apple....... nice to see them stealing ideas right from the start........
Yeeeeah but that mouse had wheels not a ball. I’m pretty glad the Mac had a mouse ball.
@@blackrockcity doesn't matter..
@@chloedevereaux1801 they didn't steal the idea. They paid a guy in Palo Alto to design the first Macintosh mouse, if I recall.
Crapple was always expensive. Atari and Amiga were better and cheaper.
Paul Schindler in this episode is very clownish! He was 100% wrong!
Just because The Mac did not make his "List" he completely pans it.
The Mac was an overpriced Joke in the 80's
No and no. The people who joked about it were quite literally the people who lacked vision. You don’t want to be in that camp in life, trust me.
Gee, look at the map of the United States... in low-res, black and white. How amazing...
this was back in 1984. It was impressive. Google fan boy 😒
"Apple wants the Macintosh to be taken seriously in the office and I'm sorry but I just don't think that's ever gonna happen". - say again?
Of course, looking back more than 30 years later, things are different today, but this show was shot in 1985, and that's the lens it was intended to be viewed through. You *did* hear where he listed the reasons *of the time*, right? Small (9-inch diagonal) black & white screen, too expensive, etc. Add in that it only had 128KB of RAM, which was small even for its day, no hard drive, only 400KB floppy disks, etc.
@@SweetBearCub Well, technically the prediction stood its ground. The PC market ended up swallowing 90% of the market share of all computers ever sold. The Mac held a niche market for some things, but the big corporate companies that started putting a computer on every desk for their employees in the late 80s never went to Apple.
Still too expensive for what you get inside a beautiful outside.
For desktop computers and laptops, Macs are still a small part of the market. The install base of Windows/Linux PCs is many times the size of the Mac base. A major factor is pricing (you can get a budget computer for as little as $300), but also people are familiar with using a PC with Windows and MS Office at work (Macs are mostly used in companies that do professional graphics or audio). And of course the majority of gaming PCs are Windows machines, because a lot of titles, especially indie games, are not Mac or Linux compatible (though Linux compatibility is a lot more common now than 10 years ago).
(If you don't believe me check this page on market share of different OSes per month for 2013-2019: www.statista.com/statistics/218089/global-market-share-of-windows-7/ )
The Mac-as-computer has almost always been a home-user or student product, with some small-volume niche markets like desktop publishing and video that no longer require a Mac in any way at all. Plus the last good computer they made was the IIci.
Mac the original windows
But it didn’t run on top of an existing OS in the way that Windows had to run on DOS.
Xerox the original mac
IIRC, Apple nonetheless filed an intellectual-property suit against Microsoft over Windows in the late 1980s.
windows 95 = Macintosh 89
@@JaredConnell Xerox: a Mac without copy and paste, without draggable windows, without a trash can, and without software. All for 100x the cost.
I still think that Windows is better for productivity. I have a MacBook Pro at work and I don't like it. It's not as intuitive an OS.
Sounds more like a matter of preference than a real difference in usability, unless there's more to it.
Windows didn’t exist when this show was aired. Microsoft was a Macintosh software company at the time.
Sorry, but I never liked Apple. I think they are a horrible company.... Microsoft is just the same. Linux forever!!!
@maddog187killa Penguin Power!
So none of the companies that keep Linux alive are horrible?
The Mac was a waste of time; they would have done better by not restricting the Apple IIGS, which had color capability and backwards compatibility with the Apple II. Did I mention the color screen? I contend that if Steve Jobs hadn't neutered the GS's RAM capacity, the GS would have been super, super better. The Mac was a downgrade, not an upgrade
Backwards compatibility itself kneecapped the IIgs before it even got out of the gate. And Jobs wasn't at Apple when the IIgs came out.
So for instance, IIgs fans always moan about how terrible it is that Apple limited the CPU to 2.8MHz to avoid competing with the Mac. But there's another, worse limitation baked into the IIgs architecture: the graphics memory (including for the IIgs graphics modes) is part of the 1MHz "compatibility" core of the machine - so the slow CPU isn't even the limiting factor here: as slow as the CPU is, the video RAM is *slower*. In other late-80s platforms like the Amiga and Atari ST, the CPU was supplemented by the chipset, which could ease the burden on the CPU. Even if the IIgs CPU had been exactly as fast as the Amiga's, the lack of that kind of support hardware meant it was forever limited by the display-RAM bottleneck.
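Rough numbers on that bottleneck (my own estimate, assuming the 1MHz side moves about one byte per microsecond and a full 32,000-byte super-hi-res buffer rewrite):

$$32000\,\text{bytes} \times 1\,\mu\text{s/byte} = 32\,\text{ms per full-screen update} \;\Rightarrow\; \approx 31\,\text{fps}$$

and that's the ceiling before the CPU spends a single cycle on game logic, no matter how fast you clock it.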
I contend that the overall design of the IIgs (resulting primarily from choices made for backward compatibility) made it an underperforming, dead-end design mired in almost decade-old technology. On paper the system could look competitive with other late-80s machines like Amiga and Atari ST but couldn't back that up with real-world performance for a variety of reasons. I really think Woz was wrong on this one: Carrying forward the Apple II design was not a good plan for the future.
But remember also the Mac came out before the IIgs. The original Mac was double the price of a comparable Apple II (A complete Apple IIc, with monochrome display, was $1300 in 1984, while the 1984 models of Macintosh were $2500-$2800) - but it had a faster CPU, wider memory bus, a higher-capacity disk drive (400K on the first Mac models, vs. 140K on the Apple II), and a higher-resolution display (512 x 342 vs. Apple II's 560 x 192 double hi-res, or the more typical 280 x 192 hi-res). The Apple II could display color, but it wasn't really good at it (typically 6 colors at 140 x 192, though some software used a 15 color 140 x 192 mode) so I don't count that as much of an asset. Mac was more expensive but it was a better machine.
Then when the GS came out, the Mac platform got a major upgrade shortly afterward with the Macintosh II (including color graphics!) - but of course that was a much more expensive machine as well, around $5500. The IIgs started at $1000 (approx. $1500 for a minimal complete system with one disk drive and a monitor), and the first model shipped with a bare minimum of RAM (256KB, half of which was "slow RAM" in the 1MHz Apple IIe core). So at that point the price gap had widened to a factor of 4 or so, but the Mac was still the better machine in my opinion. (If you wanted a better machine *and* a better price, you could get an Atari ST... or an Amiga 500 once that came out.)
The Apple II was a great machine for its time (i.e. late 1970s to early 1980s), but its real strengths were in the clever engineering hacks that let Apple deliver such a great combination of power and economy - and those kinds of hacks don't translate into a good long-term foundation for a computer platform. As it was, the Apple II (particularly the IIe) carried Apple through the 1980s, long enough for them to turn the Macintosh into a platform that could carry them through the 1990s. I think they did about as well as they could possibly have hoped, being one of very few prominent 1980s computer brands still relevant in the early 2000s (and the only one of the 1977 "big three" of Commodore, Atari, and Apple to even survive that long) - to say nothing of today, *another* 20 years later! I really doubt they would have done better than that by sticking with the Apple II as the foundation for their machines.
you think? 🙄
Steve Jobs left Apple a year before the IIGS came out. Why do people say so much nonsense about Jobs??
@@blackrockcity Also, his tenure with the Mac made him a de facto king of the Mac camp, but he had zero influence outside of that.
STEWART CHEIFET POOPED HIS PANTS