Should we take the word of analog evangelists as gospel? Or are we better off waiting for flying cars? Secure your privacy with Surfshark! Enter coupon code UNDECIDED for an extra 3 months free at surfshark.deals/undecided If you liked this, check out How We Solved The Home Wind Turbine Problem th-cam.com/video/SQKHJm7vd4E/w-d-xo.html
You should do a video on BrainChip and their neuromorphic chip based on a spiking neural network. It has just been launched into space on a satellite called ANT61. It uses microwatts to a few watts of power.
(circa 1960) "It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years." - John von Neumann
At worst, we just won't be able to brute-force it. There's still a myriad of technologies that could somehow make things faster; better chip architectures are one of them. Maybe we'll figure out how to program quantum computers for more standard tasks. As of now they're only better for specific tasks.
The difference here, of course, is that von Neumann wasn't dealing with the limits of physics, just the technology of his day. We're making silicon transistors that are approaching the hard limit of their atomic structure. We'll have to figure out a different way to scale computing power beyond shrinking silicon-based transistors.
@@jollygoodfellow3957 The same is true for lots of things, like car engines: we make lots of them, and there's great incentive to improve them. Build a car engine that is more powerful, uses less fuel and pollutes much less: we've done it, it's just far more complex than an engine from 50 years ago. Make it a hybrid and all three benefits improve again. And analogue computers are amazing: the Iowa-class battleships continued to use their 1940s mechanical analogue fire-control computers until deactivation. They had digital fire-control computers for the 5-inch guns, since the newer guns had higher muzzle velocity, but the 1980s digital units just replicated the mechanical ones, cheaper and smaller, so you could use them even on towed guns. I imagine modern fire-control systems are much more advanced, and there are apps for snipers to use :)
@@dianapennepacker6854 I think in the far future there will be smaller-than-an-atom supercomputers that work faster than the speed of light, since they'll be inside a single atomic particle ;P Other than that, I don't see much future for improvements...
The biggest drawback of analogue circuits is that they are quite specific to a problem, so making something that will work for most/generic problems is difficult, whereas a digital computer is trivially easy by comparison. But when you need something specific, the analogue computer can be significantly faster and more energy efficient. I look forward to the hybrid components that will be coming out in the future.
Exactly. I don't find small analog components in devices bad; they're useful, cheap to make, etc. But you can't do generic computing on anything analog, and really all the progress happened through generic computing. Abandoning general-purpose digital computing would be incredibly stupid. Like, what, we'll get back into mainframes to program something? They're kind of pushing for this with the cloud; it seems like they don't want people to have access to computing, they want to take it away. Returning to an analog world is going to be really, really bad. Not to mention that overly complicated analog circuits are really difficult to design, so if you are, say, running a digital AI in some datacenter, it'd be a big help for you against others who don't have access. And yeah, they could go further and design analog computers for AI: more efficient, designed by AI, for certain parts of their models. Maybe AI itself will be adjusting knobs for node weight values or connecting and disconnecting circuitry, who knows.

Taking all those generic computing possibilities away from people is just wrong, though, and it seems like that's what they're after. I know it will happen eventually; they keep taking things away from people one way or another, including knowledge, science, communications, and now generic computing. Governments like to take things away, until they realise that what they take from their people eventually hits them back when it's too late, and gets taken from them too. They might be allowed to just exist, while others obviously continue their progress, albeit at a slower rate since the majority of people don't participate; but those others will still be moving forward while the rest move backwards. You can see where this is going, if it hasn't happened already.
@@StupidusMaximusTheFirst You reminded me of a useful point: Analogue circuits need to be calibrated regularly because of component aging. This is a problem that was bypassed with digital circuits and a prime reason they are so common now, lower TCO!
@@billmiller4800 I wonder how feasible it would be to embed regular self-calibration into the packages themselves with some combined analogue-digital circuitry.
@@rogerphelps9939 True, in most cases, but the entire point of the video is that there are cases where analogue is much faster and energy efficient, so we should keep an eye open for things that can make our battery powered devices last longer.
I studied analog computers/computing in the 1970s as part of my electrical engineering education. At one time (after that) I worked for a company in Ann Arbor, Michigan that made some of the most powerful analog computers in the world. (I was in marketing by then.) They were used, among other things, to model nuclear reactors and power plants. Incredibly powerful.
They may be powerful, but they still lose to digital computers whenever true accuracy is needed. I would not trust any analog system for designing such high-risk objects as nuclear plants. Digital computers have already established a plethora of error-correcting mechanisms, many of which only have to kick in a couple of times a week, whereas analog will inherently accumulate bigger errors with every single calculation. It all just depends on how much error we agree to tolerate. I only see analog in really specific applications. Maybe some ASIC chips or contraptions, but digital is here to stay and will govern the analog.
Same. We used them to solve differential equations. I'd been into digital computers for a decade by that point. I thought analog computers were nifty, but at the end of the course, big shrug and move on. It opened my eyes though, and I've never forgotten that.
@@shapelessed But if we look at much of modern AI, it's probabilistic in nature, so using analog systems in that environment would have a negligible impact on performance.
@@astr010 "probabilistic in nature" yes, but the probability is based on accurate math. its just "probable" because we either lack the data or the computing power to make it exact. in theory it COULD be exact.
@@shapelessed Are they inherently more inaccurate for their purpose, though? For example, the abacus and slide rule shown at the beginning seem to be fairly accurate and produce reproducible solutions. I'm not sure where the inaccuracy would be introduced in more complex systems, and certainly not with *every* calculation.
1:32 No, it may seem like an infinite set, but in practice it's limited by the signal-to-noise ratio. The SNR is effectively the number of "bits" of an analog computer; the rule of thumb is 6 dB ~ 1 bit. Also, each component adds its own noise on top of the signal, so you lose "bits" as your computation becomes more complex. BTW, that's also kind of true on digital computers: if you use floating-point numbers you lose some precision with each rounding. However, on digital it's easier to just use more bits if you need them, while on analog, decreasing noise is not so trivial.
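To put rough numbers on that rule of thumb, here's a back-of-the-envelope sketch in Python (assuming the ideal-quantizer relation SNR ~ 6.02*bits + 1.76 dB; the SNR values are just examples):

# Effective "bits" of an analog signal path, from its signal-to-noise ratio.
def effective_bits(snr_db):
    return (snr_db - 1.76) / 6.02

for snr in (40, 60, 80, 100):
    print(f"{snr} dB SNR ~ {effective_bits(snr):.1f} bits")
# 40 dB of SNR is only ~6 bits; every noisy stage in the signal chain eats into this budget.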
In analog you can also increase the distance between the minimum and the maximum, which does a similar thing, but that is also not so trivial... you would not want to work with 20,000 volts or 1,000 psi :D Another problem is keeping noise stable in order to have a working calibration. Every change in physical conditions can affect it. You would not want it to give a wrong result just because summer is too hot this year.
Precision is not the point here; on the contrary, the goal of an analog computer is to deal with imprecise values in a more subjective way. It's closer to fuzzy logic than to precise math.
I think the best way to sum it up is: analog is fast and efficient, but hard to design (or at least hard to formulate the problem for). Once you build it, it is only good at solving that specific problem. Digital, on the other hand, is much more flexible. You have an instruction set and can solve any problem you can write an algorithm for using those instructions, so you can solve multiple problems with the same machine. The tradeoff is the aforementioned lower speed and efficiency (compared to analog). My favourite story about digital vs. analog is the Iowa-class battleships: they were built in the 1940s and reactivated in the 80s. The fire-control computers (electromechanical analog computers using cams, differentials and whatnot) were state of the art back in the day, but given the invention of the transistor and everything since, the navy did look at upgrading them to digital. What they found is that the digital system did not offer greater accuracy than the analog one. While technology advanced quite a bit over 40 years, the laws of physics remained the same, so the old analog computers worked just as well.
Oof. No offense, but as an analog electrical engineer, I think you should have consulted experts instead of marketing press releases before making the video. Analog engineering is about tradeoffs: you can minimize power consumption, but you will lower almost every other figure of merit. You should also at least mention electrical noise, since it is one of the biggest downsides of analog compared to digital circuits.
Artificial intelligence is such a misnomer. Ultimately it's a bunch of _if then else_ statements, and eventually it will run out of stuff. It can't create a thought. This makes it stupid. Artificial stupidity. Why are we wasting our time with artificial stupidity when there is an abundance of naturally occurring stupidity??
To reduce the influence of noise in analog calculations, you need to use phase encoding instead of pulse (amplitude) bit encoding. With phase encoding, the ALU operates at one frequency, and 1 and 0 are phase-shifted sinusoidal signals.
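A toy illustration of that claim (my own sketch, assuming simple BPSK-style encoding where 0 and 1 are 180-degree-shifted sinusoids and the receiver correlates against a reference carrier, so amplitude noise largely averages out):

import numpy as np

rng = np.random.default_rng(0)
f, fs, n = 1000.0, 100_000.0, 1000          # 1 kHz carrier sampled at 100 kHz
t = np.arange(n) / fs
ref = np.sin(2 * np.pi * f * t)             # reference carrier

def encode(bit):
    phase = 0.0 if bit else np.pi           # 1 -> 0 rad, 0 -> pi rad
    return np.sin(2 * np.pi * f * t + phase)

def decode(signal):
    return int(np.dot(signal, ref) > 0)     # correlate with the reference and check the sign

for bit in (0, 1):
    noisy = encode(bit) + 0.8 * rng.standard_normal(n)  # heavy amplitude noise
    print(bit, "->", decode(noisy))         # the phase, not the amplitude, carries the bit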
I think the point here is that analog design is not being anachronistically proposed as an alternative in areas where digital design has already established itself for good reasons, like signal processing, where accuracy and error correction are pivotal. Modern analog research rather focuses on young areas like AI, where exact processing matters much less than radical parallelization.
I built and used analogue computers (although I didn't call them that) to control the temperatures in a multi-layered cryostat for my Ph.D. work in the mid '70s. I did the number crunching on a digital computer that filled the basement of the maths department building, using punch-card input. 😮 Twenty-odd years later I found myself working with an engine management computer for a helicopter that was pure analogue. When I approached the system engineer at a large, well-known aerospace company who had design control authority for the system, to ask some fundamental questions about it, he didn't have a clue; he was purely digital. I'm retired now, but if I drag my knowledge out of the closet along with my flared jeans and tie-dyed T-shirts, perhaps I'll come back into fashion. 😁
It is precisely people like you who need to start channels, so that pure digital people can see how you approach a problem and so we capture your invaluable decades of extremely useful experience before the world loses it. What a pity it would be to have too few people who can help us remain adept at such things and continue to reap their benefits, and who can show everyone how the world got here in the first place. I might not be an analog guy, but I would watch your channel any day.
I think automatic transmissions are essentially analog computers? If so, I think an AT is a perfect use case for an analog computer, where a digital one might be more prone to error?
@@RickBeacham I guess so, because it uses hydraulics. And then there's my Alfa Romeo 147, which has a Selespeed gearbox, an automated manual transmission. It consists of the standard (manual) 6-speed gearbox with the standard clutch, plus an electronically controlled hydraulic (robotic) actuator that works both the gears and the clutch. I guess that makes it a hybrid analogue/digital computer.
Analogue computing is analogous to the P vs NP problem in pure mathematics: it is fantastic at anything which is hard to calculate but quick to check. In this case, anything hard to figure out how to express, but with solidly defined parameters. It works by shunting a good deal of the difficulty of solving the problem up front, onto the designers of the machine. It can take years of incredibly hard work to figure out how to express a single problem in analogue form, but once you DO, computing the answer for any combination or variant of the problem is virtually instantaneous.
No it is not. Offset and bias currents together with component tolerances will ensure that any analog computer can be easily outmatched by a digital computer.
I started my career with analogue computers in the 1970s as they were still being used in industrial automation for motor control (PID: Proportional, Integral and Differential). I worked in a repair centre and built some test gear to allow me to calibrate them. It's no surprise to me that they have come back, within certain niche applications they are very powerful, although not particularly programmable, unless you count circuit design as programming :-)
And there's nothing really new under the sun. I studied computer science in the 80s and we learned analog computing in a similar way. The 80s saw the operational amplifier as an integrated circuit, but that's about all that changed. OK, these days there are pretty fast ADCs (e.g. delta-sigma), so hybrid systems could be faster. The main caveat is the mathematical understanding of the problem. There's no way to just start programming Python and hope to solve the problem with tons of iterations. ;) Math is king here.
Gave a talk at DataVersity on context. Explained that in the '70s we could build a nuclear power plant instrument panel with gauges of different ranges; this allowed the techs to scan the whole wall and immediately see if anything looked out of normal range (usually straight vertical). However, everyone wanted digital, not realizing that with that change each number had to be individually read, consciously scaled and then thought about (compared with 'normal'). With digital came the necessity for alarms, because it took too much mental effort to scan the wall. Something few consider to this day...
Basic HMI stuff. With an analogue gauge, you get 3 things straight away: 1. The current value. 2. The range of the values (and where the current value sits within that). 3. The rate of change. You have to work that out each time for each digital display. If you want an accurate reading - digital, if you want to monitor a whole lot - analogue.
I use analog displays that are digitally sourced at my job. One nice feature that has been programmed in is that when a limit is exceeded, the color of the display changes. Depending on the severity, the display may be yellow or red. I realize this was readily available in the 1970s though. We started seeing it in our products in the late 1990s/early 2000s.
That's true for some clocks (mechanical with an escapement) but not a 1950's wall clock that runs off a 60 Hz synchronous motor. It's an interesting thought problem though... is an hourglass digital or analog? The answer is digital.
Just like an emulated sine wave is not a true sine wave: it's stepped and smoothed, but it will never be a pure sine wave, because switching causes steps. Digital systems can approximate analog motion but never capture every possible value, because you can never fully break an analog signal into a digital one: as accuracy goes up, the size of each discrete unit goes down, to the point where you're operating at a microscopic level.
In the case of audio playback, the stair-stepping of digital doesn't matter past a certain sample rate, as the limited physical movement of your speakers will smooth it out if your DAC hasn't done so already.
I'm skeptical about the broad assertion ('Why the Future of AI & Computers Will Be Analog') that analog computing will dominate in the future. Analog computing clearly has its niche, particularly with tasks involving continuous, real-world signals (such as audio and visual processing, or interpreting sensor data) where this technology presents a clear advantage. However, framing this niche strength as 'the future' and implying a universal superiority over digital computing seems a bit overstated to me.
Man, I get the clickbait game, but it's nauseatingly naive and insultingly ignorant to assert that any form of analog computing can match or surpass the real teraflops and petaflops companies like OpenAI / Microsoft Azure are doing on the real machines that make LLMs possible and scalable today. Pardon me whilst I vomit.
These are just pointless forecasts and guesses. The journalists are (or act like they're) literally on drugs, riding a hype. You can't take them seriously when they talk about computer science. They've become the extended arm of the marketing departments and the willing barkers of the pseudo-tech bros. You are right to be skeptical :) Just take that sloppy remark about cryptocurrency and energy consumption at the beginning. Matt didn't even question that some of the use cases are really just circle-wanking and/or money-printing scams that are very bad for the environment AND for people. Let's call it A.S.: Artificial Stupidity! :) (In contrast to A.I., a thing that actually exists in the real world, hehehe.) Have a good one, Ben! :)

Edit: Later in the video, in the part about AI (around 15:00): "(for an analog computer:) Rather than ones and zeros the data is retained in the form of voltages." ...which is also true of digital systems, and is the basic principle of DRAM, which has to be refreshed because the voltage on the storage element (the capacitor in the (C)MOS switch) drops due to natural leakage. That will be THE SAME for analog computing. This is possibly information from the "Mythic" company that Matt is repeating. I doubt they do not know the basic physical and electronic principles that make their prototypes tick, so this is most likely a straight-up lie to deceive the public or investors. That's what we're up against... Nice guys, nice industry... don't you think?! :)))
Whenever I design electronics, I often use analog preprocessing, since it takes much less energy to amplify, integrate or filter signals with op-amps (summing stages, PT1 and PT2 models, integrators and differentiators) than to use convolutions or FFTs to build FIR or IIR filters, which need a lot of processing power.
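For a sense of what one of those op-amp stages replaces, here's a minimal sketch of a PT1 (first-order low-pass) done digitally as a one-pole IIR; the cutoff and sample rate are arbitrary example values:

import math

def pt1_filter(samples, cutoff_hz, fs_hz):
    # One-pole IIR approximation of an RC low-pass: what a single op-amp stage gives you "for free".
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs_hz)
    y, out = 0.0, []
    for x in samples:                # a multiply and two adds per sample, every single sample
        y += alpha * (x - y)
        out.append(y)
    return out

noisy_step = [0.0] * 10 + [1.0] * 90
print(round(pt1_filter(noisy_step, cutoff_hz=50.0, fs_hz=1000.0)[-1], 3))  # settles toward 1.0, like the RC output voltage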
@@rogerphelps9939 Any good engineer knows that the quality of "superior" or "inferior" is multi-dimensional. The writer said that analog used less power, and that is very true most of the time. Analog will be less granular than digital, if accuracy is important. Analog is generally faster, if speed is important. If cost to perform a calculation is important, then digital is a better choice most of the time. What led you to your conclusion of "inferior"?
@@rogerphelps9939 Clearly you are not a well-informed engineer. Superior or inferior all depends on the intended purpose and the parameters of interest. Example parameters might include required accuracy, expected lifetime, running costs, maintainability, energy use, environmental consequences, ease of use, human factors and so on.
Great video, thanks for sharing. The biggest problem with analog computers is that there are so few people who know how to work on them. I'm reminded of a hydroelectric plant I toured once that had an electromechanical analog computer controlling the units. At the time I visited, it was already considered ancient and they were actively trying to replace it, simply because nobody knew how it worked. They only knew how to turn it on and off and wind the clock spring once a shift, the exact number of turns it needed to keep running. They had been trying to replace it with a new computer, but none of the many attempts could match its precision in operating the plant and maintaining proper water flows. They were in constant fear that it might break. I checked back maybe 20 years later to ask how it was doing, and no one working there knew what I was talking about. Sad that it was long forgotten by everyone at the plant. I thought it should have been retired to a museum, and I still hope that possibly it was.
@@bzuidgeest Perhaps you are right, but some applications can be very specific, like the one mentioned above at the power plant. Analog computers can be very power efficient, so very cheap to run. They have their merits.
Many people don't know how things work and are still able to use them, such as a car, or a kid playing a video game... So the goal for this "analog computing" is to somehow implement it for the masses such that they won't even need to concern themselves with the underlying tech. My point is that it's not the users who are the problem, but rather the implementation of the product, or the lack of one.
@@bzuidgeest Isn't that more an issue with implementation? Digital computers have been designed to be general purpose for decades, but the first ones were single purpose machines. Modular Synths and The Analog Thing show how you can alter analog computing to do different things based on how you patch them together although they connect modular functions using wires. It seems that the companies looking into analog computing are trying to do a similar thing on the scale of integrated circuits.
@@Tsudico Maybe, but a digital computer does not have to be rewired, and the same block of transistors can do many calculations. Analogue computers, indeed like synths, need to be rewired every time. You could add some analogue switching controlled by a digital controller... a hybrid. These things are difficult to define.
When I was in the Navy, the ships I was stationed on had a single gun on the bow. This was a 5"/54 and was operated by an analog computer. If you watch the SyFy movie Battleship you can see this type of system aiming a large gun at a given target. One of the big advantages of analog computers is that they are not subject to EMP.
@@esunisen3862 exactly - most electronics will just reboot. If you are under the blast there is enough energy to burn out circuits - even analog. I did EMP Testing on the B1-B in the late 80's.
@@canobenitez How basic is it? Any diesel car/truck/whatever will still get fried; if it's got digital controls, fried... anything with any electronics will probably just be dead.
It's important to understand in the clock example - that even though the clock hand (second hand) was moving discretely (moving only every 1 second, and not continuously), it still is analog. Because it is not necessarily the continuous nature that makes something analog, it's if the physical process happening is _analogous_ to the real event. The twist of the second-hand is directly proportional to how much of a minute has passed. However, in a digital clock, the voltages in the bits (0-5V) are not analogous or directly proportional to the time that has passed.
6:43 So far I feel like this video is less about analog vs digital computing and more about mechanical vs digital (not even mechanical vs electronic)... Both digital and analog computers can be built in both electrical and mechanical ways. Hybrid computers are something that will happen, though.
Playgrounds in the 80s were amazing. My favorite was the spinning metal platform of death. That was still safer than sharing a seesaw with a betrayer, that thing was coming for your chin or they tried to smash your spine on the ground. Good way to learn which kids couldn't be trusted. Flying off the swings to land onto gravel and dirt and get cool scratches. I was always bleeding a little bit as a kid, a result of good times :)
Your reference to Amdahl's law seemed wildly out of place. It deals with how parts of some problems cannot be done in parallel, so the fastest a problem can be solved is bounded by the slowest serialized work. This holds for digital or analog computation. Amdahl's law is about adding more computers (cores, in the case of modern digital CPU design) to a problem, but adding more analog computers (or any parts performing calculation at the same time) to a system would still leave slowdowns for the work that has to be serialized. Consider MONIAC and how it uses water flows: if some part of it needs a flow that lasts, say, three seconds, it doesn't matter how many other flows the creator managed to run in parallel; the fastest runtime will be at least three seconds. To continue, terms like "critical section" have specific definitions in computer science that simply aren't applicable here; they deal with synchronizing the work of multiple threads and cores, and would have analogous design considerations in analog machines that have multiple parts contributing to the same computation.
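For reference, the law itself is just this (a quick sketch; p is the parallelizable fraction of the work and n the number of parallel units, whether digital cores or analog stages):

def amdahl_speedup(p, n):
    # Upper bound on speedup when a fraction p of the work can be spread across n units.
    return 1.0 / ((1.0 - p) + p / n)

print(round(amdahl_speedup(0.95, 8), 2))          # ~5.93x on 8 units
print(round(amdahl_speedup(0.95, 1_000_000), 2))  # ~20x: a 5% serial portion caps the speedup at 20x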
As one who has programmed and used analog computers in the past, all I can say is, "I told you so." They are the most powerful computers in the world. It took a while, but I knew they would be back. Welcome back, my old friends. Do great work.
9:52 That's not truly an analog clock either; it only shows the time every second. As in the temperature example at the start, if it were analog it should be able to show fractions of a second too and change continuously, but it only moves in 'bits' (seconds).
I think analog has a great future in large language models and other AI, which is fortunate because generative AI is currently the fastest-growing use of digital computation and the energy it requires. The evidence for this is that LLMs continue to function even when the weights and computations are quantized down to, say, 4 bits of precision for models designed to run on small systems. Four bits corresponds to a need for voltage accuracy of only 6.25% in an analog implementation, which should be easy to achieve. I don't know how easy it would be to implement algorithms like backpropagation, which is used in training, in an analog form, but my guess is that it shouldn't be difficult.
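To make that 6.25% figure explicit, a small sketch (assuming plain uniform quantization of weights normalized to [-1, 1]; the example weights are made up):

def quantize(w, bits=4):
    # Snap a weight in [-1, 1] to a uniform grid with 2**bits steps across the full range.
    step = 2.0 / (2 ** bits)                 # 4 bits -> step of 0.125, i.e. 6.25% of full scale
    return round(w / step) * step

print(2.0 / 2 ** 4 / 2.0)                    # 0.0625 -> the ~6.25% tolerance mentioned above
print(quantize(0.3141), quantize(-0.777))    # 0.375 and -0.75: close enough for many LLM weights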
In the future, AI will have to reduce its main code and simplify it :D Otherwise it will start to become confused and act like old people under the influence of dementia... just a guess :D If there is any code at all :P An LLM is a complex structure, and in a lot of models there is no actual code; instead a database is used as a kind of script, which is still code of a sort, but its logical meanings can't be converted to machine code directly... well, that's what LLMs are indirectly designed to do :D Or maybe they can translate it to standard code themselves before it becomes impossible to translate/simplify :D
Sadly, that's exactly where analogue processors fail. Analogue circuits are far less scalable than digital logic; shrinking an analogue chip is incredibly difficult (see Apple's modem division). Also, since AI at its core is the same operations repeated many times over, it's sensitive to accumulating errors, exactly what an analogue chip introduces. Mythic, as far as I know, gave up on their big AI dreams and is focusing on smaller applications, some still AI-adjacent, that aren't as impacted by the problems of analogue error. Analogue has its place, but general analogue computing is very far away, if it's ever reached. ASICs are the digital answer to analogue: still digital, with its benefits in scaling, without the harrowing complexity of analogue, and with the efficiency of fixed-function hardware. Analogue chip design is the dream, but as the laws of scaling punish the transistor, they hit analogue many times over, while the math of the chain rule pummels it to the ground in most cases.
Digital still wins. No need to worry about bias currents, offsets or component tolerances, all of which scupper anything but the most trivial of applications.
@@darlokt51 I see your point that backprop may not be quantizable down to as little as 4 bits because of the chain rule. But the real reason I'm back in this thread is to flag a public lecture I just watched by Geoffrey Hinton (whom anyone involved in AI will immediately recognize), posted a month ago on the University of Oxford YouTube channel. Starting at minute 27 he talks about digital vs. analog implementations for almost ten minutes; his conclusion is that digital is better for future AI despite the potential advantages of analog.
Although I'm now retired, I well remember studying analog computers in the 70s and having to work through the math in order to correctly program the system. It is true that the accuracy of an analog computer is far greater than digital; chip and computer designers simply chose speed and convenience over accuracy. It is only today that society is beginning to appreciate the advantages of analog systems; a great example is the rediscovery of music from turntables and albums. There are many aficionados and music lovers, like me, who still enjoy the sound of an analog album far more than its digital counterpart, regardless of the digital format used to play back the digitized performance. I'm typing this on a regular laptop, and I recognize the convenience of it; I'm just unconvinced that it's the be-all and end-all. I believe there is a huge future market for analog computers/circuits that has been waiting for applications needing greater accuracy on dramatically reduced power budgets; measuring the health of bridges comes to mind: thousands of sensors reporting on stresses within individual components, all in real time. At any rate, thanks for this insightful video; as always it was thought provoking and time well spent!
Analog computing certainly has its uses, but not even hybrid devices are going to replace the main use cases of supercomputers. For one, a measurement is only as precise as the measurement device, which is basically always discrete: for example, if it can measure mass in nanograms, then any difference smaller than that cannot be measured. More importantly, analog computers are task-specific machines. Making task-specific digital chips has historically been incredibly expensive, but that's changing now, with Intel, TSMC, Tower and others making custom chips. Custom digital chips built to solve specific problems can also save 95% or more of the energy compared to general-purpose digital computers; it depends on the problem.
Matt, you should have started by showing an op-amp differentiator or integrator circuit in operation, to show that analog circuits don't "compute"; they simply operate. Sure, most people are not electrical engineers, but that would show people what you're referring to. As the video stands, I'm sure most people can't conceive of what an analog computer is made of.
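For readers without the EE background, this is essentially all an ideal inverting op-amp integrator "computes": Vout(t) = -(1/RC) * integral of Vin(t) dt. A rough numerical sketch with arbitrary component values:

R, C = 10e3, 1e-6            # 10 kOhm and 1 uF -> RC = 10 ms
dt, vout = 1e-4, 0.0         # 0.1 ms time step
for _ in range(1000):        # hold a constant 1 V input for 100 ms
    vin = 1.0
    vout += -(1.0 / (R * C)) * vin * dt
print(round(vout, 2))        # -10.0 V: the output just ramps at -100 V/s, no "program" involved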
There is one new area in which analog computing could help us: the evaluation of neural networks. Neural networks are fundamentally analog, and are currently simulated digitally. It would be quite feasible to put tens of millions of analog electronic neurons on a chip.
One of the biggest hurdles to analog ASICs has been the extreme difficulty of designing the IC circuitry. In fact, the most expensive and labor intensive part of a processor has always been the analog circuitry to regulate and supply power, among other things. With the help of AI we’re going to see a massive explosion in this field, since it can easily pull from natural laws, fab constraints and desired outcomes to take much of the guesswork and iteration out of analog ASIC engineering. Literally machines building the next generation of machines.
Analog computers rely on physical properties like voltage or pressure to represent data. These physical properties can be influenced by environmental factors like temperature or noise, leading to errors in calculations. Since analog signals are continuous, analog computers can't achieve the same level of precision and accuracy as digital computers, and we need that for medical purposes. This is why tech scientists are still going the digital way, but this time trying to work with photons, which is faster because it uses light instead of electrons, producing less heat and using less energy. And of course making transistors smaller and smaller cannot go on forever; that's also why companies like ASML and Intel, instead of only making transistors smaller, are making them more complex, with techniques like advanced patterning.
Good comment! I'd add that there is a class of analog circuits called switched-capacitor circuits, which are not continuous. They're typically used in ADCs these days, but were all the rage for a while.
Large analogue systems will probably be better at predicting / controlling chaotic events. The whole point of "expert systems" is to digitize more analogue-like actions and observations.
While working for Texas Instruments, I spent six months learning about the SEL 32/55 hybrid (analog and digital) computer. Ultimately, I worked on the digital side as a systems programmer. I left TI when they planned on transferring me to work on the Ada compiler. I wanted to be a VM/370 (BSEPP, SEPP, SP, etc.) systems administrator/programmer. I took a job as the lead systems programmer at the Owens Corning Fiberglas Research Center in Newark, Ohio. I loved the job and working with the IBM user group SHARE.
Like, the first 75% of this video talks about what analog computers are not. Add some snappy comments and a twist, and you still don't come close to what's promised in the video headline.
Excellent presentation. Thank you very much. As one who grew up with slide rules and Heathkits, which I still have and use, it is a joy to follow your explanations. I applaud your use of the more precise and historic philosophical description of the distinction: discrete vs. continuous. Digital is really not exactly the same concept as discrete. In the 1950s, as I was growing up and cutting my teeth on electronics courtesy of the US government selling complex devices that originally cost thousands of dollars for a quarter a pound, I also enjoyed building lots and lots of Heathkits. These collections of discrete (there's that word again) components and excellent manuals educated post-war generations in the practical fundamentals of electronics and saved many dollars over factory-assembled gear. I coveted the analog computer kits Heathkit marketed, but I never built one. I was thrilled to see The Computer History Museum in Mountain View, CA has one on display. It was all I could do to stop myself from twirling its knobs and dreaming of the warm glow of its hot-electron devices, differential amplifiers, patch cords, and more. One observation on your presentation: you refer to a slide rule as an analog device, which it is. But in the same breath you declare an abacus also an analog device. While I'm not a skilled abacus user, I think the positions of the device's beads are always assigned to specific locations of clearly limited number; therefore calculating with an abacus is a discrete (though not digital) technique.
The large hall-sized computers you show at the start of the video are actually digital computers. They just used vacuum tubes instead of transistors so took up a lot of space and used a lot of electricity.
In 1971, analog computing was one of the engineering classes I was required to take to get an engineering degree. Generally the process was similar to what you show at 10:27 in your video. It was very helpful when looking at dynamic systems. You programmed the system by turning dials and moving jumper wires, so there was no easy way to store your program and reprogram the system. To create a practical analog computer you would need to develop digitally controllable capacitors, resistors and transistors.
You mentioned using analog computers to calculate heat flow in objects. I remember using nonlinear partial differential equations to model heat flow in a metal bar from a point source. Bessel functions were used in the process of calculating the heat flow.
I personally view analog computing as simply "unprotected digital computing". Correct me if I'm wrong. Digital computing is essentially gating electricity such that you can move electricity along logic corridors to create predictable outcomes by compressing a whole range of analog signals into 2 (zero and one). The compression is an engineering solution that allows us to concern ourselves much less with signal clarity and correction, and deal instead in these coarse bits. Of course, the compression and gating also is a huge waste of the actual electrical signal. The analogy is like trying to watch TV through a white blanket sheet. Instead of the high fidelity of the electrical signal itself, you get 0 and 1. Taking off the protective layer provides us with a ton of untapped signal to play with... but it also comes with its own share of hairy issues. Noise and such. Signal correction?
Don't know, just commenting to boost your comment so someone smart notices it and has a chat with you about it so I can glean so dumb-people nuggets from the chat :)
Some semiconductor engineer (I don't remember his name) said that the advances in semiconductor technology can be boiled down to discovering new techniques for making circuits run at higher frequencies while preventing them from falling back into the analog domain.
Not really. The difference between digital and analog in terms of computing is that in digital computers you use symbolic representations composed of discrete bits, transformed during discrete clock cycles (which mathematically maps to the realm of integers, sequences and so on), while in theory an analogue system expresses values on a fully continuous spectrum and operates on them in a continuous way (the real number line, differential equations, and so on). What you're describing is that we implement digital logic using what is an analogue system (voltages that gate transistors): we discretize the values (binary, ternary), run multiple parallel channels (bits) and clock the system. You could, however, implement a digital computer in any system that has finite discrete states; it's just that transistors are very fast, cheap and scalable, and actually do not consume any power when fully on or off (which is the ideal state for a digital computer).

In general, digital gives you the best precision and also the highest versatility, because symbolic calculation can represent anything you want to a given precision. Analog systems need to be tailored to every specific problem by rewiring their underlying hardware to represent the problem (or just being a fixed accelerator), and while they have advantages in efficiently calculating certain problems (those that describe physical systems), they are not efficient for other problems (the ones that map to discrete quantities). In fact, while we know that every Turing-complete digital system is equivalent, studies have also shown that you can map any analogue problem to a digital version and vice versa... it just says nothing about the complexity of the transformed calculation.

TLDR: you're confusing the physical layer with the conceptual layer a bit, but indeed, since voltages are an analogue quantity, when you restrict them to representing discrete values you lose something the analogue system could do very naturally and efficiently. That's a quirk of this specific implementation, though.
@@fluffy_tail4365 The physical reworking issue got me wondering about modular analog systems that can be quickly transformed into different forms. Seems like a digital system could manage the transformation of analog systems into whatever form suited the need. I am obviously out of my depth here, just an idea.
@@fluffy_tail4365 Like the way Alphafold figured out protein folding. A digital system can be used to alter a physical form of coding information that does stuff in the real world.
Be very careful with the desuperheater on your home HVAC system. These systems use a simple heat exchanger to transfer the heat from the heat pump working fluid to the domestic hot water, but the catch is that this happens at high temperatures (duh). If you have hard water (many, if not most, municipal water supplies are hard water to varying degrees), the heat exchanger will quickly cause the minerals to plate out of the water onto the heat exchanger surfaces. The result is that the heat exchanger becomes less efficient over time and will eventually clog up completely. To combat this, you would need to add a water softener with a high enough capacity to feed the hot water supply. Such a water softener is very expensive to buy and maintain. The alternative is to have your heat exchanger cleaned out every year, which is time-consuming and messy. To be fair, electric and gas water heaters also have this problem with their heating elements, but electric hot water heaters have cheap and easily replaceable heating elements, and gas tanks typically directly heat the outside of the tank, so there is nothing that can really clog up.
a general purpose, multi-function device is great, but it'll never be as good at a specific task as a device built to do that ONE task as efficiently and effectively as possible. As for analog physics, think of what's faster, pushing a button that activates an actuator that pushes a ball, or just pushing the ball yourself?
A good complementary development to the modern analog computer is the ongoing "second age of microcomputers", which has had a watershed moment lately through gradual consumerization in custom PCB design, low-cost microcontrollers and FPGAs, making it possible for hobbyists to manufacture dream machines deviating from the mainstream PC architecture. There is even a project called "Tiny Tapeout" that allows you to publish an ASIC design and get it anthologized with hundreds of other hobby designs on one chip. There's a whole menagerie of retrocomputer-styled designs being made using these components, from the little ESP32-VGA, Agon Light and Neo6502 to more ambitious FPGA boards like the C256 Foenix and Commander X16. At first glance, they're nostalgic toys, but they signal a further decoupling of personal computing from industrial applications, an important counterbalance in an era where everyone complains about how much our devices and data are not really under our control.
I invented an analogue computer concept myself, back in the 70's. I proposed using the pulse frequency of a rectangular (digital) waveform as the physical variable. Summation of pulses would correspond to integration, driving a pulse generator; differentiation would be done via subtraction, again driving the pulse frequency. I'm not sure whether it would work or not, but it could be very useful if it does.
Oh yeah totally, differentiating the pulse, I totally understand what that means and can see the potential ;) Smart people like you confuse me, how are we the same species? :)
@@GordonSeal I tried, in Python, but I'm not convinced that I got the model correct. It needs to be asynchronous, but it wasn't. I'm unable to do much now. I also suspect that A/D and D/A components would be required at the pulse generator interface, to ensure the asynchronous nature of the entire system. After further thought, the best approach may be to simply charge or discharge a capacitor via the pulse charges, to control a pulse generator. That may actually work, but may not offer any advantages.
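A very rough toy model of that capacitor idea (a sketch only: synchronous rather than truly asynchronous, with an accumulator standing in for the capacitor and arbitrary constants):

import random

def pulse_frequency_integrator(input_rate_hz, seconds, dt=1e-3,
                               charge_per_pulse=0.01, leak_per_s=0.5):
    # Each incoming pulse adds charge to an accumulator (the "capacitor"); the stored
    # charge leaks slowly and sets the output pulse rate, so the output frequency
    # roughly tracks a leaky integral of the input pulse rate.
    level = 0.0
    for _ in range(int(seconds / dt)):
        if random.random() < input_rate_hz * dt:   # a pulse arrives in this time step
            level += charge_per_pulse
        level -= leak_per_s * level * dt           # leakage
    return 50.0 * level                            # output pulse rate ~ stored charge

print(round(pulse_frequency_integrator(100.0, 2.0), 1))   # roughly 60-70 Hz
print(round(pulse_frequency_integrator(200.0, 2.0), 1))   # roughly double that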
Having used analogue computing in the 70's, I can say they are a real pain to use for any "quick" investigation. You are always either hitting over-voltage saturation or the signal drops so low you hit thermal noise. Getting them to work takes a lot of fiddling. This is just a case of the grass looking greener over the fence. I will stick with digital processing, with an analogue front end to get the signals into a form the digital system can handle well (best of both worlds).
So how does this analog revival mesh with the emergence of quantum computing? I'd love to see a follow on episode talking about this. Love your content!
I've often felt that Quantum Computing was just a really fancy method of performing analog computations. Yes, you get more direct control over effects like quantum entanglement, but that also comes with the downside of making the computer hard to scale. I expect we'll have powerful analog components in wide usage long before the GA quantum computer is cracked.
Isn't an abacus a manually operated digital computer? The beads are moved by hand but they still represent an on or off state. The slide rule is an analog manually operated computer. I used them a lot before we got digital calculators. When I was in the Air Force we used analog computers (large ones) and they were fantastic machines.
I was most fascinated by the analog chip's ability to run really effective and efficient image recognition software. *BTW, did you hear about the lost Disney sodium vapor process and how it's been rediscovered? The Corridor Crew YT channel just came out with a video where they collaborated with a guy who figured out how to do this lost art of filming. It's actually a game-changing technique; I recommend looking into it. It means you don't have to stress and mess with all the green screen effects only to still have things look off. I had no idea how fascinating this lost art was, or the effect it had on some amazing classic movies that had such a great, unique look that modern movies just don't reproduce. I'm really hoping we see this method skyrocket in use throughout the film industry, along with a return of 2D animation and other forms of creative media.
I watched that over the weekend. It was a great breakdown. It's crazy how simple the tech was back then and how much more efficient it is compared to today's tech.
I hear that sodium vapor has its own unique issues like requiring a very controlled environment (which you can get a sense of in the CC video). It may be better to use the knowledge of how sodium vapor achieved its effect while using a different frequency that may not have the same issue although there are limits to which frequency of light we can choose. I hear IR doesn't work due to objects giving off heat and I'm not sure it would be safe for anyone to stand in front of UV lighting for hours on end although it may allow using sun lit environments.
@@Tsudico They could probably recreate the effect with LEDs programmed to output the color at that wavelength. Add in some basic software and light sensors to adjust the light to the room, and bam, there's your solution.
@@ML-dk7bf One of the benefits of using other options like green/blue screen is that it can be used in natural light or outdoor environments. From what I've seen that doesn't work with sodium vapor which was what I was indicating. There might be alternative wavelengths of light that occur in natural light but that aren't used for film that could provide the benefits of the sodium vapor process in a greater variety of lighting conditions but only testing would know. Otherwise the sodium vapor technique, no matter which wavelength of light, just might be limited to specific use cases.
It is a truism: "you can't keep reducing the size of components forever." However, with ever-increasing speeds, I suspect you can upgrade the materials for those components... rather than silicon (a semiconductor), use silver at the speed and temperature where it acts like a semiconductor, or switch to magnetic processing to spin electrons around the component rather than through it. Also, a conformal coating of diamond (vapor deposited) would go a long way toward ejecting unwanted heat.
@Matt, very good video. FYI: Standard digital microprocessors actually operate in an analog world. The only real differences are comparators that separate signals into high and low voltages(1's and 0's) and storage as a high or low voltage. We assign meaning to patterns of these stored voltages like the ASCII table.
This sounds analogous to a CD or DVD being digital: each pit is read as open or solid (reflective), but each pit is actually a physical area with a hole either blasted out by a laser or not. That hole (or solid spot) is quite clearly analog. It can be measured, and their sizes and placements vary by nanometers, but the digitization at the reading laser turns all the 0.000002, 0.000034, 0.000001 reflectivity measurements into 0, and the 0.9999997, 0.999999 and 1.0000002 measurements into 1. I guess they never stray far from 0 or 1; there will never be a 0.1 or 0.9, much less a 0.5. But with analog, the data/signal can be anywhere between 0 and 1 and still get processed by the next operator in the chain.
I think as part of the digital computers portion, the sequential bit CAN be eliminated if needed. We actually do this in a number of cases, and it's called hardware acceleration. It requires making a physical piece of hardware that can do the calculation in one operation that would normally take multiple steps. For analogue computing this hardware acceleration is in many cases a mandatory step.
I don't mind the idea of combining the best of digital and analog computing, but it feels a bit short-sighted. The future isn't either one, but bio-chemical computing, where we don't use 1s and 0s but a breadth of options, the same way our brains work.
Some things to keep in mind... an individual neuron can operate at up to ~1 kHz, and most systems (muscles, heart, etc.) behave as chaotic systems. Side effects include reaction time, massive parallelism, synchronization and de-synchronization. Keep in mind it takes years and decades to learn how to use our brains.
I wonder a lot about the potential of photonic computing. Each frequency of light is its own data stream, so one beam can hold many streams, super cool. I suspect you are correct in thinking the overall structure of computing has to change. Just a gut feeling, I have no expertise in this stuff. Copying living systems sounds chaotic, harder to make specific and purposeful. I'm seeing crystals and light computers, take the idea of structure to higher levels. No reason we couldn't fuse photonic and biochemical computing together, perhaps like some sort of future copy of the brain hemispheres. Seems like we want to create gods to rule over us. Makes sense, our entire history is filled with myths of beyond-human powers, we're just fulfilling the prophecies we wrote. After we experience real gods, we'll probably get over the whole idea and wish we could go back to them being imaginary.
This video reads like so many speculative articles one used to find in pop-science magazines: "one day we might have solar collector arrays orbiting the Earth, beaming down electrical power through invisible streams of microwaves." Of course nothing ever came of any of that.

Limitations of digital computers aren't necessarily actual limitations, but rather limitations in ourselves that we need to get past. 8-bit digital computers CAN process numbers with more than 256 discrete states; you do it in software, paying the price in calculation speed and in the storage space needed for the code and temporary data. Amdahl's law is only a limit if you can't reshape the problem you're attacking to get around it: replace the algorithm with one that has fewer or no sequential steps, or subdivide the problem into smaller chunks that can each be parcelled out to individual CPU cores, things of that sort. We didn't get to where we are today by only seeing problems instead of potential solutions.

We have digital computers because they're flexible, reliable and anywhere from reasonably to extremely powerful. Analog machines, well, not so much. An analog computer built to do one thing can't do another (forget about adding new features through software patches; all the "patching" you'll be doing will be hardware), and they don't return precisely the same answer every time. Digital computers have margins of error as well, but their errors are dependable and predictable. Analog computer errors also have the potential to change and grow over time as the machine is used, due to wear and tear.

Will there be no room for analog computers in the years ahead? It would be silly and irresponsible to completely dismiss them (one just has to remember that famous IBM boss back in the 1950s who "predicted" that six computers or whatever it was would be sufficient to cover the needs of the entire world), but replacing digital computers in general or on a large scale? No way, forget it.
Honestly, the idea of marrying analog sensors, which can detect change without needing constant direct power, with a digital always-on hub of information just made me think we're not far from cyborg / android bodies: a digital mind requiring the more immediate power source (similar to a regular brain), able to read feedback from sensor points that don't require as much power until triggers activate them. Maybe I'm thinking too far out, but this seems closer than it has in the past. If analog memory can be created, where previous outputs can be retained at significantly lower energy cost rather than constantly having to generate new ones, then yeah, I can see prior referenceable memory becoming a real thing, and maybe then Assembly Intelligence (what I call AI lately) can start to store past "experiences". Then we're not far off from actual AI (Artificial Intelligence). Interesting but chilling stuff.
I started working with ML applications on microcontrollers about 5 years ago. Basically, once the algorithm was determined, it was just a long polynomial. I came to the realization that it could be done with an analog circuit with essentially no digital computation delay, so I think analog has an opportunity. Back in the 40s, 50s and 60s, rocket guidance systems were analog computers. In the late 60s, the Apollo guidance computer was the first digital one that could take specific inputs for configuration changes.
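That "long polynomial" point is worth spelling out: once training is done, inference can reduce to a fixed chain of multiply-adds, which is exactly what an analog circuit (or a tiny MCU) can do. A sketch with made-up coefficients:

def evaluate(coeffs, x):
    # Horner's method: c0 + c1*x + c2*x^2 + ... using one multiply-add per coefficient.
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

model = [0.12, -0.8, 0.05, 0.003]    # hypothetical fitted coefficients, lowest order first
print(evaluate(model, 2.5))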
I literally see zero serious application of analogue computing in anything that requires actual computing. You can make a dedicated chip on the motherboard for time, date, a calculator and other fairly trivial stuff, but for calculations such as encoding or encryption you can never beat digital, regardless of the energy savings and other irrelevant "arguments".
Neuronal connections are a combination of analog and digital, to give some context to your video. The various inputs to a neuron from surrounding neurons are analog signals, while the final, summated output from the neuron is digital, as it either fires or it doesn't (the action potential). So we need a combination of both, in my opinion.
I'm not convinced. I mean, right now AMD can shove a LOT of AI processing power (AI cores) INTO a CPU, not onto a big fat analog chip. And this has always been a thing in electronics: how you can simulate analog with digital. Nvidia dominates this world because AI is a lot more than just hardware, and no one is touching Nvidia THIS decade. You can't just have a piece of hardware that works well; you have to have SOLUTIONS to customer problems, and no one is better at this than Nvidia. They have the best software engineers working these problems and have already solved some amazing issues. For example, find out how long it takes TSMC to make a mask, which is used to make an IC with EUV lithography, and these masks have to be perfect: it's many servers running for a few days. Nvidia has gotten this down to a single-server operation in less than a day. Do you know how much energy savings that is? Do you know how critical that capability is for TSMC? They can charge customers a lot less money over time, so if a customer has to change a circuit it's not so painful. So THAT is what AI is; it's not a chip.
Yes indeed, I remember doing analog computers in some university classes. The basic idea was to translate a system's transfer function (the math equation that defined its behavior) into an analog circuit design with the same one, but instead of supplying a physical input like force or speed, you supplied voltages, currents and sine waves. The result was an output that represented the real physical response.
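As a worked example of that translation (a standard textbook analogy, not tied to any particular machine mentioned here): a mass-spring-damper and a series RLC circuit obey the same differential equation, so the circuit's charge/voltage response is the mechanical response after rescaling:

```latex
% Mechanical system: displacement x(t) driven by force F(t)
m\ddot{x} + c\dot{x} + kx = F(t)
% Electrical analog: charge q(t) driven by a source voltage V(t)
L\ddot{q} + R\dot{q} + \tfrac{1}{C}\,q = V(t)
% Correspondence:
m \leftrightarrow L, \quad c \leftrightarrow R, \quad k \leftrightarrow \tfrac{1}{C}, \quad F \leftrightarrow V, \quad x \leftrightarrow q
```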
Analog computing is huge in computational neuroscience. A whole subfield, neuromorphic computing, is dedicated to predicting how neurons behave using analog computers. The way neurons fire, have ion channels open, and signals propagate is all governed by differential equations, so doing that on a chip is much more efficient and quicker.
They are not governed by differential equations, they can be described by differential equations. Any system that changes can be modeled with differential equations.
There is so much analogue in industry. Instrumentation devices send 4-20mA signals and control valves are driven by 4-20mA outputs sent by PLCs. I'm sure the cooling in data centres is controlled by analogue valves and PID. Most electrical networks still use analogue measuring transformers, and generator excitation and other complex feedback loops use PID. Full digitalisation has only occurred in cyber and the internet.
From what I understand, most industries use PLCs whose programming languages are discrete, like Ladder. The only analogue part is the transmission in certain cases.
Differential equations are just algebra with some calculus thrown into the mix. You get to solve these types of equations a lot when designing the control systems for robots. Spring equations are especially annoying to solve because they oscillate, and you need to predict that oscillation or your robot won't be able to take two steps without falling over.
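A minimal sketch of why that oscillation has to be predicted rather than eyeballed: a damped spring stepped forward in time with semi-implicit Euler (constants are made up for illustration):

```python
import math

# Mass-spring-damper: m*a = -k*x - c*v  (constants are made up for illustration)
m, k, c = 1.0, 40.0, 0.8
x, v = 0.05, 0.0          # initial displacement (m) and velocity (m/s)
dt = 0.001                # time step (s)

for step in range(3000):  # simulate 3 seconds
    a = (-k * x - c * v) / m
    v += a * dt           # semi-implicit Euler: update velocity first...
    x += v * dt           # ...then position, which keeps the oscillation stable
    if step % 500 == 0:
        print(f"t={step*dt:4.1f}s  x={x:+.4f} m")

# A controller needs the natural frequency ~ sqrt(k/m)/(2*pi) to anticipate the swing
print("natural frequency ~", round(math.sqrt(k / m) / (2 * math.pi), 3), "Hz")
```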
Hi Matt. I asked myself the same analog-or-digital question when I met with the Boeing engineers in 1967. I had been hired to go to the USA because they believed me to be the guy with answers. They showed me their best analogue computer and then the fastest scientific digital computer available at that time (it was an IBM 360/65). I wrote a few algebraic/calculus equations (using Fourier analysis) based on the physics of the natural world, plus a few mathematical modelling scenarios. I tried to run them on the analogue system first, because the natural world we live in is not black or white, on or off. Our world is naturally a continuous analogue with a morphing of possibilities, like transitions in a graphic as you shade from one color to another. It turned out that one set of equations generated an even more complex set of equations that were beyond the understanding of even the best theoretical mathematicians. When I used Fortran (and matrix theory, like upper Hessenberg forms) on a digital machine I could at least see a way forward to modelling the actual physical world, even if it was an approximation. By using quad-precision (4-word), 32-digit floating point variables I could get close enough for the Boeing aircraft designers and space scientists to get the answers they needed. Digital computing is an approximation, and faster computing has meant we can get some kind of result faster. However, it is not a true representation of physical reality, and especially not of how the neural networks within our human minds work to solve physical problems. So to come back to the question, analogue or digital: the answer is eventually analogue, but certainly not with the current elemental and chip-based technology. The time it will happen is when we develop "organic computing". There are some early signs of research into the cellular structure of neurons and the associated storage of information in simulated brain cells and memory. But we are at the early stages, and I am personally doubtful whether it is a scientific journey we should take at this time of such cultural ignorance. So as you ponder what AI is, you should perhaps also consider the nature of human intelligence, and the glory of what it is to have a mind generations ahead of, and already infinitely more capable than, any we are likely to fabricate in this so-called "modern world". Party on.
My first contacts with electronic analog computers were the Bosch fuel injection controls, first used by Volkswagen and Volvo, starting in 1967. Some of those cars are still in use. The computers worked pretty well. When they didn't, though, only dealers had ways to test and verify that the computer was/wasn't the issue. Woe visited the owner who owned a fried one. Careful jumper-cable use was an absolute must.
Analog makes the most sense for this sort of problem. To me, it's obvious. The way AI works is mixing values in simple ways that can all be done with analog, and you do not need decimal representations in the middle at all. You could have real time responses of unbelievable complexity, because you don't need to wait for clock cycles, just wave propagation, which is as fast as you could physically achieve
Analog computers would have to be rebuilt for each different problem. They can't be reprogrammed; they must be physically reconfigured. I don't see a way around that.
The author said a new problem is solved by the differential equation that expresses it, so each new problem only requires a new differential equation. So is that a wrong statement on his part?
As an undergrad student in 1968 at Sydney Uni I saw a demo of analog computing. The target system was a pendulum, which was modeled in a hand-built electrical analog computer. The output was a voltage. It was configured to run on the same time scale as a physical pendulum. Both the pendulum and a very large voltmeter were set up side by side, and the pendulum and analog computer were set running at the same time. The pendulum and the voltmeter needle mirrored each other for perhaps a minute and were still doing so when the lecturer shut them down.
The German translation is delightful. It has a slight American accent, but is very easy to understand!
When I spoke about the physical limits of electronic chips back in 1999-2000, and said we would surely come back to a mix of digital & analog chips, especially for the wide worlds of artificial intelligence and cryptosystems, most people laughed; a very few of us were unhappy not to be working in this new area of science & technology. And the whole of computing will be reinvented, with new ways of programming, coding & making software, across everything from supercomputers and mainframes to mini and micro computers and transputers (computer systems with parallel CPUs, which I discovered and loved working with for the future of the space field, astrophysics & astronautics, in Algeria). It's wrong to think this will stop the evolution of digital systems in hardware and software. Here we are now with the rethinking around quantum computing, and the new discoveries from the James Webb telescope that overturn a great number of theories and received knowledge, such as the time evolution, limits and scale of the Universe, etc. Good luck to the new generation of researchers, discoverers, scientists & mathematicians!!!
I was waiting for you to talk about near lossless data storage for music & photography because of the recent analog chip breakthroughs. Storing analog on a chip will be a real game changer in those fields. Being able to record & copy perfect analog signals will make vinyl records, tape, & film redundant. That's going to have such a huge impact.
In audio field, you have just reinvented BBD (Bucket Brigade Delay). Having a non-mechanical memory still requires sampling and thus anti-aliasing filters. Even if audio quality is improved beyond what BBD offered, I don't see any quality benefit beyond digital storage. Digital audio is transparent already. Quantization you want to eliminate doesn't degrade quality, thanks to dithering. It may however be useful for more efficient near-lossless (not truly lossless because you can't make perfect copies in analog) storage.
If I may ask a follow up question: AFAIK a main constraint with analog computers is the lack of error correction capability, now in the context of ML I can imagine that since the answer is usually some sort of probability distribution or some statistical estimate, one could forgive some numerical errors, but is there an actual estimate or proof that this is the case? Else how are analog enthusiasts planning to deal with this?
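I'm not aware of a general proof either; the usual argument is empirical, and easy to sketch: perturb the weights of a toy model and watch how little the output ranking moves (toy example with random matrices, not any real model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": one linear layer + softmax over 10 classes (purely illustrative)
W = rng.normal(size=(10, 64))
x = rng.normal(size=64)

def softmax(z):
    z = z - z.max()          # for numerical stability
    e = np.exp(z)
    return e / e.sum()

clean = softmax(W @ x)

# Add, say, 1% relative Gaussian noise to the weights, as an analog array might
noisy_W = W * (1 + 0.01 * rng.normal(size=W.shape))
noisy = softmax(noisy_W @ x)

print("top class unchanged:", clean.argmax() == noisy.argmax())
print("max probability shift:", np.abs(clean - noisy).max())
```

Whether that robustness holds after errors accumulate across many layers is exactly the open question the comment raises.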
Can anyone recommend channel(s) that cover similar topics with greater accuracy/expertise? Matt covers interesting topics, but I notice significant errors anytime he covers subjects that I truly understand, and I guess it’s just not reasonable to expect accuracy in extremely complex subjects from anyone who lacks a PhD in said subjects.
Might want to lookup advances in neuromorphic computing. Mixed-mode analog digital mimicking the human brain to solve problems. AI opportunities without massive data centers and huge memory requirements.
If Mythic can deliver the same compute power as a GPU, then why doesn't it need a PCIe slot? All of the pictures shown had connections with far fewer pins than a modern GPU, so I am wondering how they plan on getting the bandwidth?
Saw a video about a battleship targeting computer from WWII. You’d set the distance, wind speed, etc. It would also consider the speed and tilt of the ship. It was 100% mechanical with all kinds of cams and linkages to calculate the proper firing solution.
Dang, what a neat topic! This is my favorite video of yours I've seen in a while. I'm excited to see which components of compute are better suited to analog hardware. The power savings sure are enticing for applications where one thing needs to do a very specific job for a long time.
This material deserves a preceding video just explaining the key concepts behind the final message. It is not easy to capture the essence of analog computers. Anyway, today I discovered your site. Good, insightful material; thanks for sharing this.
So which ascii key-combo do we press for the "delta" symbol? We're all going to need that for applying vectored deltas through logarithmic analysis and output
Before digital signal processing was a thing, equalizing the frequency response of a room or car cabin was done by adjusting the gains/cuts of analog circuits called bandpass filters. That was the technology used for decades before we started adjusting numbers in Fourier algorithms in code uploaded to a chip to change the sound of your audio source, making the room or car sound more pleasant and balanced across the audio spectrum.
And in the meantime, high-speed digital hardware design is already very analog-heavy. Balanced signalling, impedance matching, traces becoming antennas..
There are tremendous challenges with analog computers in two directions. First is noise, and therefore the difficulty of scaling down the size of components. That means analog computers are relatively large compared to digital ones of the same compute power. The point about solving things quickly is that an analog computer solves a specific problem, whereas our digital computers solve a large range of problems. Anyone who has worked in analog computing (and they were standard in control theory classes back in the day) knows that they have to be reconfigured and scaled for each problem. So yes, theoretically they can take less energy than digital computers. So can asynchronous computers (ones that don't run on a master clock). The problem is that they have other limitations. A potentially better use would be as a co-processor, where a specific computation is managed by an analog engine under the control of a digital computer.
Imagine an analog computer AI assistant that was nothing but an analog microphone, an analog computer, and an analog speaker, but you could hold a conversation with it just like ChatGPT. That would be crazy.
I'm curious what the Rare Earth demands of Analog computers is vs their Digital counterparts? Though any reduction in Energy use is a net reduction in Rare Earth use because it means fewer solar panels are required.
Ah, analog computers, the great new offer to save mankind from those sneaky bits. I retired in 2K. I heard this SAME pitch no less than four separate times during my working life. Now we are getting it every 9 years or less. It sounds great. It isn't. Why was the Bismarck unable to shoot down the painfully slow British Swordfish torpedo planes? Analog computers. True. They are not reprogrammable in the field. Never were. These aren't either. Yeah, patch cords that are tiny, such an improvement. As for Moore's suggestion from 20 years ago, the industry blew right past his concern. Watch this, but don't share it with the analog evangelicals, not without a medical team nearby: th-cam.com/video/Gzkb3Zc8pGE/w-d-xo.html
As part of my Master's thesis I am analyzing analog computers to determine whether they may be better than digital computers for implementing a cognitive architecture. I find it really fascinating, and I do believe philosophical reflections about the discrete/continuous structure of reality and human experience are key points in the machine consciousness field.
Dallas OR and The Dalles OR are about 150 miles from each other. One is rain forest and the other a desert, that's a pretty huge distinction when talking about water usage.
For impact on the world of processors, more than analog computing I'm looking at backside power delivery which finally separates power circuits in a chip from computing circuits, and more advance designs like the Mill CPU (still coming after >10 years) which re-attempts VLIW and promises GPU/DSP chip efficiency brought to a general purpose instruction set.
I foresee a computer that will mix analogue, digital, and quantum computing in such a way as to play to each one's strengths. In particular, you could use the digital chip to calculate how to program the analogue chip. I have no idea if this would use less power than simply doing it with the digital chip, though. Perhaps you would lose power for a single calculation but gain it if the solution only had to be slightly modified for each of a large number of calculations.
Would it be possible to integrate analog computing into digital computing systems? I know there is some music hardware these days that uses digital control of analog parts, but I don't know if that's applicable to analog computing.
9:07 We will get down to 1 molecule, then smaller molecules and then integration begins. Like files are compressed. Smaller, less resistance allowing lower voltage.
Here's food for thought for you .. not my opinion just a thought ... Remember The Jetsons? Don't remember seeing any non-white .. it is what it is 👋
When I think analog? I think vacuum tubes
No
You should do some redeem BrainChip and their Neuromorphic Chip based on Spiking Neural Network. It has just been launched into space with with a satellite called ANT61. It uses micro watts to a few watts of power
@@Sekhmmett no, what?
(circa 1960) "It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years."" - John von Neumann
The current time is *always* at the limit of computing power. That's how time works.
We just won't be able to brute force it at worse. Still a myriad of technologies to somehow make it faster. Through better architectures of the chip is one of them.
Maybe we will figure out how to program quantum to computer to more standard things. As of now they are only better for specific tasks.
The difference here, of course, is that Von Neumann wasn't dealing with the limits of physics, just their current technology. We're making silicon transistors that are approaching the hard limit of their atomic structure. We'll have to figure out a different way to scale computing power outside of shrinking silicon based transistors.
@@jollygoodfellow3957 Same is true for lots of stuff like car engines, we make lots of them and its great initiative to improve them.
Make an car engine who is more powerful, use less fuel and pollute much less. we done it. Its just much more complex than an engine 50 years ago.
Make it an hybrid to increase all 3 benefits again.
And analogue computers are amazing, the Iowa class battleship continued to use their 1940's mechanical analogue computers.until deactivation. They had digital fire control computers for 5" guns but newer one has higher muzzle velocity and the 1980 ones just replicated the mechanical ones cheaper and smaller so you could use them even on towed guns. I imagine modern fire control systems are much more advanced and you have apps for snpers to use :)
@@dianapennepacker6854 i think in far future there will be smaller then atom supercomputers that will work faster then speed of light as they will be inside one atomic particle ;P other then tat i dont see much future for improvements...
The biggest drawback of analogue circuits is that they are quite specific to a problem, so making something that will work for most/generic problems is difficult, where a digital computer is trivially easy in comparison. But, when you need something specific, the analogue computer can be significantly faster and more energy efficient. I look forward to the hybrid components that will be coming out in the future.
Exactly. I don't find small analog components in devices bad, they are useful, cheap to make, etc. Though yeah, you can't have generic computing on anything analog, and really all the progress happened through generic computing. Abandoning digital generic computing would be incredibly stupid. Like, what, we'll get back into mainframes to program something? They are kinda pushing for this with the cloud, seems like they don't want people to have access to computing, they wanna take this away. Returning to an analog world is gonna be really really bad. Not to mention that overly complicated analog circuits are really difficult to design, so if you are say running a digital AI in some datacenter, it'd be a big help for you against others who don't have access. and yeah, they could go further and design analog computers for AI, more efficient, designed by AI, for certain parts of their models. Maybe AI itself will be adjusting knobs for node weight values or connect/disconnect circuitry, who knows. Though taking all those generic computing possibilities away from people, is just wrong and it seems like this is what they are after. I know it will happen eventually, they are anyway taking things away from people, one way or another, including knowledge, science, communications, now generic computing, it's all gonna be taken away eventually. Govs like to take away things, until they realise, those things they take away from their people, eventually it's gonna hit them back when it's too late, and they are gonna be taken away from them too. They might be allowed to just exist, while others will obviously continue their progress, albeit at a slower rate since the majority of people don't participate, but they'll still be moving forward, the rest backwards, you can see where this is going - if it's not happened already.
@@StupidusMaximusTheFirst You reminded me of a useful point: Analogue circuits need to be calibrated regularly because of component aging. This is a problem that was bypassed with digital circuits and a prime reason they are so common now, lower TCO!
@@billmiller4800 I wonder how feasible it would be with some combined analogue-digital circuitry to embed regular self calibration into packages themselves
No need for hybrids. Digital is king for anything that is not very trivial.
@@rogerphelps9939 True, in most cases, but the entire point of the video is that there are cases where analogue is much faster and energy efficient, so we should keep an eye open for things that can make our battery powered devices last longer.
I studied analog computers/computing in the 1970s as part of my electrical engineering education. At one time (after that) I worked for a company in Ann Arbor, Michigan that made some of the most powerful analog computers in the world. (I was in marketing by then.) They were used, among other things, to model nuclear reactors and power plants. Incredibly powerful.
They may be powerful, but still lose to digital computers whenever true accuracy is needed.
I would not trust any analog system in designing such high risk objects as nuclear plants.
Digital computers have already established a plethora of error-correcting mechanisms, many of which only have to jump in a couple of times a week, whereas analog will inherently produce ever-bigger errors with every single calculation. It all just depends on how much error we're willing to tolerate.
I only see analog in really specific applications. Maybe some ASIC chips or contraptions, but the digital is there to stay and govern the analog.
Same. We used them to solve differential equations. I'd been into digital computers for a decade by that point. I thought analog computers were nifty, but at the end of the course, big shrug and move on. It opened my eyes though, and I've never forgotten that.
@@shapelessed but if we look at much of the modern AI they are probabilistic in nature so the use of analog systems in this environment would have a negligible impact on their performance.
@@astr010 "probabilistic in nature" yes, but the probability is based on accurate math. its just "probable" because we either lack the data or the computing power to make it exact. in theory it COULD be exact.
@@shapelessed are they inherently more inaccurate for their purpose though? For example the abacus and slide computers shown in the beginning seem to be fairly accurate and produce reproducible solutions. I’m not sure where the inaccuracy would be introduced in more complex systems and certainly not with *every* calculation.
1:32 No, it may seem like an infinite set, but in practice it's limited by the signal-to-noise ratio. The SNR is effectively the number of "bits" of an analog computer; the rule of thumb is 6 dB ~ 1 bit. Also, each component adds its own noise on top of the signal, so you lose "bits" as your computation becomes more complex. BTW, that is also kinda true of digital computers: if you use floating point numbers you lose some precision with each rounding. However, on digital it's easy to just use more bits if you need them; on analog, decreasing noise is not so trivial.
Absolutely right.
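For anyone who wants to put numbers on that 6 dB-per-bit rule of thumb, the standard effective-number-of-bits conversion looks like this (nothing here is specific to any particular chip):

```python
def effective_bits(snr_db: float) -> float:
    """Effective number of bits for a given signal-to-noise ratio.
    Standard ENOB formula: each ~6.02 dB of SNR buys roughly one bit of resolution."""
    return (snr_db - 1.76) / 6.02

for snr in (30, 60, 90):
    print(f"{snr} dB SNR  ->  ~{effective_bits(snr):.1f} bits")
```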
In analog you can also increase the distance from the minimum and the maximum, which does a similar thing, but that is also not so trivial... you would not want to work with 20000 volts or 1000 psi :D
Another problem is keeping the noise stable in order to have a working calibration. Every change in physical conditions can affect it. You would not want it to give a wrong result just because summer is too hot this year.
Precision is not the point here; on the contrary, the goal of an analog computer is to deal with imprecise values in a more qualitative way. It's closer to fuzzy logic than to precise math.
@@joseeduardobolisfortes Um, if you don't care about precision, can't we just use low-precision approximations on a digital computer, though?
Sort of, but not really, and not practically, which is kinda the point
I think the best way to sum it up is: Analog is fast and efficient, but is hard to design (or at least formulate the problem). Once you build it, it is only good at solving that specific problem.
Digital, on the other hand, is much more flexible. You have an instruction set and can solve any problem you can write an algorithm for using those instructions, so you can solve multiple problems with the same machine. The tradeoff is the aforementioned slower speed and lower efficiency (compared to analog).
My favourite story about digital vs. analog is the Iowa-class battleships: they were built in the 1940s and were reactivated in the '80s. The fire control computers (electromechanical analog computers using cams, differentials and whatnot) were state of the art back in the day, but given the invention of the transistor and all that since then, the navy did look at upgrading them to digital. What they found is that the digital system did not offer greater accuracy than the analog one. While technology advanced quite a bit over 40 years, the laws of physics remained the same, so the old analog computers worked just as well.
Oof. No offense, but as an analog electrical engineer you should have consulted with experts instead of press releases from marketing before making the video. Analog engineering is about tradeoffs, you can minimize power consumption but will lower almost every other figure of merit. You should also at least mention electrical noise, since it is one of the biggest downsides of analog compared to digital circuits.
Artificial intelligence is such a misnomer. Ultimately it's a bunch of _if then else_ statements, and eventually it will run out of stuff. It can't create a thought. This makes it stupid. Artificial stupidity.
Why are we wasting our time with artificial stupidity when there is an abundance of naturally occurring stupidity??
Someone turns the microwave on and every analog computer in the house has a meltdown.
@maxweber06 How does a microwave affect analog?
To reduce the influence of noise in analog calculations, you need to use phase encoding instead of pulse (amplitude) bit encoding. With phase encoding, the ALU will operate on one frequency. 1 and 0 will be phase-shifted sinusoidal signals.
I think the point here is that analog design is not being anachronistically proposed as an alternative in areas where digital design has already established itself for several good reasons, like signal processing, where accuracy and error correction are pivotal. Modern analog research rather focuses on young areas like AI, where exact processing is much less important than radical parallelization.
I built and used analogue computers (although I didn't call them that) to control the temperatures in a multi-layered cryostat for my Ph.D work in the mid '70s. I did the number crunching on a digital computer that filled the basement of the maths dept building, using punch card input. 😮 Twenty-odd years later I found myself working with an engine management computer for a helicopter that was pure analogue. When I approached the system engineer at a large, well-known aerospace company, who had design control authority for the system, to ask some fundamental questions about it, he didn't have a clue; he was purely digital. I'm retired now, but if I drag my knowledge out of the closet along with my flared jeans and tie-dyed T-shirts perhaps I'll come back into fashion. 😁
It is precisely people like you who need to start channels, so that purely digital people can see how you approach a problem, and so your invaluable decades of extremely useful experience are captured before the world loses them. What a pity it would be to have too few people who can help us remain adept at such things and continue to reap their benefits, and who can show everyone how the world got here in the first place.
I might not be an analog guy, but I would watch your channel any day.
Come back - do not let that knowledge disappear
I think automatic transmissions are essentially analog computers?
In contrast I think AT is a perfect use case for analog computer where a digital one might be more prone to error?
@@RickBeacham I guess so, because it uses hydraulics. And then there's my Alfa Romeo 147, which has a Selespeed gearbox, an automated manual transmission. It consists of the standard (manual) 6-speed gearbox with the standard clutch, and adds an electronically controlled hydraulic (robotic) actuator that operates both gear selection and the clutch. I guess that makes it a hybrid analogue/digital computer.
I agree. Its like comparing apples to oranges
Analogue computing is analogous to the P vs NP problem in pure mathematics. It is fantastic at anything which is hard to calculate but quick to check. In this case, anything hard to figure out how to express, but with solidly defined parameters.
It works by shunting a good deal of the difficulty of solving the problem up front to the designers of the machine. It can take years of incredibly hard work to figure out how to express a single problem in analogue form, but once you DO, computing the answer for any combination or variant of the problem is virtually instantaneous.
No it is not. Offset and bias currents together with component tolerances will ensure that any analog computer can be easily outmatched by a digital computer.
My analog computer has ten fingers and ten toes.
lol your computer has digits
Wouldn’t that be digital 😂
What has 10 toes but are not your legs??
My legs!
That’s kinda mean to talk about your S/O like that.
And does the dishes
I started my career with analogue computers in the 1970s as they were still being used in industrial automation for motor control (PID: Proportional, Integral and Differential). I worked in a repair centre and built some test gear to allow me to calibrate them. It's no surprise to me that they have come back, within certain niche applications they are very powerful, although not particularly programmable, unless you count circuit design as programming :-)
And there's nothing really new under the sun. I studied computer science in the '80s and we learned analog computing in a similar way. The '80s saw the operational amplifier as an integrated circuit, but that's all. OK, these days there are pretty fast ADCs (e.g. delta-sigma), so hybrid systems could be faster. The main caveat is the mathematical understanding of the problem. There's no way to just start programming Python and hope to solve the problem with tons of iterations. ;) Math is king here.
Gave a talk at DataVersity on context. Explained in the 70's we could build a nuclear power plant instrument panel with gauges of different ranges - this allowed the techs to scan the whole wall and immediately see if anything looked out of normal range (usually straight vertical). However, everyone wanted digital, not realizing that with that change, each number had to be individually read, consciously scaled and then thought about (compare with 'normal'). With digital came the necessity for alarms because it took too much mental effort to scan the wall. Something few consider to this day...
So few even slow down to consider anything in these days and times..
@@Wil_Liam1 You can easily create analog-looking displays with digital tech.
@@rogerphelps9939 Nowadays, yeah, it's trivial, but back then I imagine they only had text interfaces.
Basic HMI stuff. With an analogue gauge, you get 3 things straight away:
1. The current value.
2. The range of the values (and where the current value sits within that).
3. The rate of change.
You have to work that out each time for each digital display. If you want an accurate reading - digital, if you want to monitor a whole lot - analogue.
I use analog displays that are digitally sourced at my job. One nice feature that has been programmed in is that when a limit is exceeded, the color of the display changes. Depending on the severity, the display may be yellow or red. I realize this was readily available in the 1970s though. We started seeing it in our products in the late 1990s/early 2000s.
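A minimal sketch of that kind of limit-based coloring (the thresholds are made up for illustration; real values would come from the plant or product spec):

```python
# Hypothetical alarm thresholds for one gauge
WARN_LOW, WARN_HIGH = 20.0, 80.0
ALARM_LOW, ALARM_HIGH = 10.0, 90.0

def gauge_color(value: float) -> str:
    """Map a reading to a display color, mimicking the 'scan the wall' idea:
    normal readings stay neutral, exceedances jump out by color."""
    if value < ALARM_LOW or value > ALARM_HIGH:
        return "red"
    if value < WARN_LOW or value > WARN_HIGH:
        return "yellow"
    return "green"

for reading in (55.0, 85.0, 95.0):
    print(reading, "->", gauge_color(reading))
```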
An analog clock is not continuous, its movement is a function of the internal gear ratios and the arm moves in discrete, quantifiable steps
That's true for some clocks (mechanical with an escapement) but not a 1950's wall clock that runs off a 60 Hz synchronous motor. It's an interesting thought problem though... is an hourglass digital or analog? The answer is digital.
it moves _to_ discrete, quantifiable steps, but it still moves instead of snapping
Gears are not true analog. Use a belt instead.
Just like an emulated sine wave is not a true sine wave: it's stepped and smoothed, but it will never be a pure, true sine wave because switching causes steps.
Digital systems can approximate analog movements but never emulate all possible steps, because it's literally incalculable: you can never perfectly break an analog signal into a digital one, since as accuracy goes up, the size of each discrete unit goes down, to the point where it is operating at a microscopic level.
In the case of audio playback, the stair-stepping of digital doesn't matter past a certain sample rate, as the limited physical movement of your speakers will smooth it out if your DAC hasn't done so already.
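To put rough numbers on how small those "steps" are before the reconstruction filter even touches them (a toy calculation that ignores dithering and filtering):

```python
import math

def quantize(sample: float, bits: int) -> float:
    """Round a sample in [-1, 1] to the nearest of 2**bits levels."""
    levels = 2 ** bits
    step = 2.0 / (levels - 1)
    return round(sample / step) * step

# Worst-case error of a "stepped" sine at different bit depths
for bits in (4, 8, 16):
    err = max(abs(quantize(math.sin(2 * math.pi * n / 1000), bits)
                  - math.sin(2 * math.pi * n / 1000)) for n in range(1000))
    print(f"{bits:2d} bits: worst-case step error ~ {err:.6f}")
```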
I'm skeptical about the broad assertion - 'Why the Future of AI & Computers Will Be Analog' - that analog computing will dominate in the future. Analog computing clearly has its niche, particularly with tasks involving continuous, real-world signals-such as audio and visual processing, or interpreting sensor data-where this technology presents a clear advantage. However, framing this niche strength as 'the future' and implying a universal superiority over digital computing seems a bit overstated to me.
Man, I get the clickbait game, but it's nauseatingly naive and insultingly ignorant to assert that any form of analog computing can match or surpass the real teraflops and petaflops that companies like OpenAI / Microsoft Azure are getting from the real machines that make LLMs possible and scalable today. Pardon me whilst I vomit.
These are just pointless forecasts and guesses. The journalists are (or act like) literally on drugs! On a hype. You can't take them serious when talking about computer science. They've become the extended arm of the marketing departments and the willing barkers of the pseudo-tech bros. You are right being skeptical:) Just take that sloppy remark about Cryptocurrency or energy consumption at the beginning. Matt didn't even question that some of the use cases are really just circle-wanking and/or money-printing scams, that are very bad for the environment AND for the people. Let's call it A.S.: Artificial Stupidity!:) (In contrast to A.I., a thing that actually exists in the real world, hehehe)
Have a good one, Ben!:)
Edit: Later in the video, the part about AI (around 15:00 ): "(for an analog computer:) Rather than ones and zeros the data is retained in the form of voltages." ... which is the same in digital systems and the basic principle of DRAM, that has to be refreshed because the voltage of the storage element (the capacitor in the (C)MOS switch) drops because of natural existing leakage. That will be THE SAME for analog computing. This is possibly information from the "Mythic" company, that Matt repeats.
So, I doubt that they do not know the basic physical and electronic principles that make their prototypes tick. This is most likely a straight up lie to deceive the public or investors. That's what we're up to ... Nice guys, nice industry ... don't you think?!:)))
Maybe because our own "AI" runs on analog and with a lot less power which is a looming problem with digital.
Whenever I design electronics, I often use analog preprocessing, since it takes much less energy to amplify, integrate or filter signals with op-amps, using summing stages, PT1 and PT2 models, integrators and differentiators, instead of using convolutions or FFTs to construct FIR or IIR filters, which need a lot of processing power.
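For comparison, here's roughly what the digital counterpart of one of those PT1 stages looks like as a difference equation (made-up sample rate and cutoff, just a sketch): each incoming sample costs a multiply-accumulate, work that a passive RC or op-amp stage does continuously without a processor.

```python
import math

# First-order low-pass (PT1): y[n] = y[n-1] + a * (x[n] - y[n-1])
fs = 1000.0                                # sample rate in Hz (example value)
fc = 10.0                                  # cutoff frequency in Hz (example value)
a = 1 - math.exp(-2 * math.pi * fc / fs)   # smoothing coefficient from pole matching

y = 0.0
def pt1(x: float) -> float:
    """One filter update per incoming sample."""
    global y
    y += a * (x - y)
    return y

for x in (0.0, 1.0, 1.0, 1.0, 1.0):        # step input
    print(round(pt1(x), 4))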
That may be the case but the results of using analog will still be inferior.
Every time I hear the word convolution my soul dies a little..
@@rogerphelps9939 Any good engineer knows that the quality of "superior" or "inferior" is multi-dimensional. The writer said that analog used less power, and that is very true most of the time. Analog will be less granular than digital, if accuracy is important. Analog is generally faster, if speed is important. If cost to perform a calculation is important, then digital is a better choice most of the time. What led you to your conclusion of "inferior"?
@@rogerphelps9939 Clearly you are not a well-informed engineer. Superior or inferior all depends on the intended purpose and the parameters of interest. Example parameters might include required accuracy, expected lifetime, running costs, maintainability, energy use, environmental consequences, ease of use, human factors and so on.
Kids nowadays use Arduinos, forgetting that to blink an LED, one can use an op-amp in astable mode.
Great video, thanks for sharing. The biggest problem with analog computers is that there are so few people who know how to work on them. I'm reminded of a hydroelectric plant I toured once that had an electromechanical analog computer controlling the units. At the time I visited, it was already considered ancient and they were actively attempting to replace it, simply because nobody knew how it worked. They only knew how to turn it on and off, and to wind the clock spring once a shift the exact number of turns it needed to keep running. They had been trying to replace it with a new computer, but none of the many attempts could match its precision in operating the plant and maintaining proper water flows. They were in constant fear that it might break. I checked back maybe 20 years later to ask how it was doing, and no one working there knew what I was talking about. Sad that it was long forgotten by everyone at the plant. I thought it should have been retired to a museum, and still hope that possibly it was.
Analogue computers are single purpose, where digital computers are general purpose.
They belong in museums.
@@bzuidgeest perhaps you are right, but some applications can be very specific like the one mentioned above at the power plant. Analog computers can be very power efficient, so very cheap to run. They have their merits
Many people don't know how things work and are still able to use them, such as a car, or a kid playing a video game... So the goal for said "analog computing" is to somehow implement it for the masses such that they won't even need to concern themselves with the underlying tech... My point is that it's not the users that are the problem; rather, it's the implementation of the product, or the lack of one, that's the issue.
@@bzuidgeest Isn't that more an issue with implementation? Digital computers have been designed to be general purpose for decades, but the first ones were single purpose machines. Modular Synths and The Analog Thing show how you can alter analog computing to do different things based on how you patch them together although they connect modular functions using wires. It seems that the companies looking into analog computing are trying to do a similar thing on the scale of integrated circuits.
@@Tsudico Maybe, but a digital computer does not have to be rewired, and the same block of transistors can do many calculations. Analogue computers, indeed like synths, need to be rewired every time. You could add some analogue switching controlled by a digital controller... a hybrid. These things are difficult to define.
When I was in the Navy, the ships I was stationed on had a single gun on the bow. This was a 5"/54 and it was aimed by an analog computer. If you watch the SyFy movie Battleship you can see this type of system aiming a large gun at a given target.
One of the big advantages of analog computers is that they are not subject to EMP.
Well they are but they recover immediately unless the pulse was strong enough to blow a component.
@@esunisen3862 exactly - most electronics will just reboot. If you are under the blast there is enough energy to burn out circuits - even analog. I did EMP Testing on the B1-B in the late 80's.
@@rockpadstudios what about diesel motors
@@canobenitez How basic is it? Any diesel car/truck/whatever will still get fried if it's got digital controls... anything with any electronics will probably just be dead.
It's important to understand in the clock example - that even though the clock hand (second hand) was moving discretely (moving only every 1 second, and not continuously), it still is analog. Because it is not necessarily the continuous nature that makes something analog, it's if the physical process happening is _analogous_ to the real event. The twist of the second-hand is directly proportional to how much of a minute has passed. However, in a digital clock, the voltages in the bits (0-5V) are not analogous or directly proportional to the time that has passed.
6:43 so far I feel like this video isn't about analog vs digital computing and more about mechanical vs digital (not even mechanical vs electronic) ... Both digital and analog computers can be done in both electrical and mechanical ways. Hybrid computers are something that will happen though.
Playgrounds in the 80s were amazing. My favorite was the spinning metal platform of death. That was still safer than sharing a seesaw with a betrayer, that thing was coming for your chin or they tried to smash your spine on the ground. Good way to learn which kids couldn't be trusted. Flying off the swings to land onto gravel and dirt and get cool scratches. I was always bleeding a little bit as a kid, a result of good times :)
Your reference to Amdahl's law seemed wildly out of place. It deals with how parts of some problems cannot be done in parallel, and how the fastest a problem can be solved is bounded by the slowest serialized work. This holds for digital or analog computation.
Amdahl's law deals with adding more computers (cores, in the case of modern digital CPU design) to solve a problem. But adding more analog computers (or any parts performing a calculation at the same time) to a system would still be slowed down by work that needs to be serialized. Consider MONIAC and how it deals with water flows: if some part of it needs a flow that lasts, say, 3 seconds, it doesn't matter how many other flows the creator managed to run in parallel; the fastest runtime will still be at least 3 seconds.
To continue this: terms like "critical section" have specific definitions in computer science that simply aren't applicable here. They deal with synchronizing the work of multiple threads and cores, and would have analogous design considerations in analog machines that have multiple parts contributing to the same computation.
As one who has programed and used analog computers in the past, I can only say is "I told you so." They are the most powerful computers in the world. It took a while, but I knew they would be back. Welcome back my old friends. Do great work.
What people used to call "analog computers" were just hardware differential equation solvers for the most part.
It's all gonna be weird af crystal megastructures powering supercomputers in the end
Ya dude, I'm seeing crystals and photonic computing as big part of the future.
I'm going to incorporate this in my sci-fi novels
Also makes me think of the pylons from star craft
9:52 That's not truly an analog clock either; it only shows the time once per second. As with the temperature example at the start, if it were analog it should be able to show fractions of a second too and change continuously, but it only changes in 'bits' (seconds).
I think analog has a great future in large language models and other AI, which is fortunate because generative AI is currently the fastest growing use of digital computation and the energy it requires. The evidence for this is that LLMs continue to function even when the weights and computations are quantized down to, say, 4 bits of precision for models designed to run on small systems. Four bits corresponds to a need for voltage accuracy of only 6.25% in an analog implementation, which should be easy to achieve. I don’t know how easy it would be to implement algorithms like the back propagation, which is used in training, in an analog form, but my guess is that it shouldn’t be difficult.
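A quick sanity check of the 6.25% figure, and what 4-bit quantization of a weight actually looks like (toy numbers, no particular model assumed):

```python
# 4 bits -> 2**4 = 16 levels, so one step is 1/16 = 6.25% of full scale
bits = 4
step_fraction = 1 / 2 ** bits
print(f"step size = {step_fraction:.4f} of full scale ({step_fraction:.2%})")

def quantize_weight(w: float, bits: int = 4, w_max: float = 1.0) -> float:
    """Snap a weight in [-w_max, w_max] to the nearest of 2**bits levels."""
    levels = 2 ** bits
    step = 2 * w_max / (levels - 1)
    return round(w / step) * step

print(quantize_weight(0.3137))  # e.g. 0.3137 snaps to the nearest representable level
```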
In the future AI will have to reduce its main code and simplify it :D Otherwise it will start to become confused and then start acting like old people under the influence of dementia... just a guess :D If there is any code at all :P An LLM is a complex structure, and in lots of models there is no actual code; instead a database is used as a kind of script, which is still code of a sort, but its logical meaning can't be converted to machine code directly... well, that's what LLMs are indirectly designed to do :D So maybe they can translate that to standard code themselves before it becomes impossible to translate/simplify :D
Sadly, that's exactly where analogue processors fail. Analogue circuits are far less scalable than digital transistor logic; shrinking an analogue chip is incredibly difficult (see Apple's modem division). Also, since AI is at its core a huge number of repeated operations, it's sensitive to accumulating errors like those introduced by an analogue chip. Mythic, as far as I know, gave up on their big AI dreams and is focusing on smaller applications, some still AI-adjacent, that are not as impacted by the problems of analogue error. Analogue has its place, but general analogue computing is very far away, if it is ever reached. ASICs are the digital answer to analogue: still digital, with digital's benefits in scaling, without the harrowing complexity of analogue, and with the benefits of fixed-function efficiency. Analogue in chip design is the dream, but as the laws of scaling punish the transistor, they hit analogue many times over, while the math of the chain rule pummels it to the ground in most cases.
Digital still wins. No need to worry about bias currents and offsets or component tolerances, all of which scupper anything but the most trivial of applications.
@@darlokt51 I see your point that backprop may not be able to be quantized down to as little as 4 bits because of the chain rule. But the real reason I'm back to this thread is to flag a public lecture I just watched by Geoffrey Hinton (who anyone involved in AI will immediately recognize) posted a month ago on the University of Oxford TH-cam channel. Starting at minute 27 he talks about digital v analog implementations for almost ten minutes - his conclusion being that digital is better for future AI despite the potential advantages of analog.
@@darlokt51 Another problem is translating analogue over to digital as that requires bandwidth and processing to do
Although I'm now retired, I well remember studying analog computers in the 70s and having to perform the math in order to correctly program the system. It is true that the accuracy of an analog computer is far greater than digital, chip/computer designers elected to choose speed and convenience over accuracy. It is only today that society is beginning to appreciate the advantages of analog systems, a great example is the rediscovery of music from turntables and albums. There are many aficionados and music lovers, like me, who still enjoy the sound of an analog album far more than its digital counterpart, regardless of the digital format used to playback the digitized performance. I'm typing this on a regular laptop, and I recognize the convenience of it, I'm just unconvinced that its the be all and end all. I believe there is a huge future market for analog computers/circuits that has been awaiting the need for greater accuracy and dramatically reduced power availability, measuring the health of bridges comes to mind; thousands of sensors reporting on stresses within individual components, all done in real time". At any rate, thanks for this insightful video, as always it was thought provoking and time well spent!
Analog computing certainly has its uses, but not even hybrid devices are going to replace the main use cases of supercomputers. For one, a measurement is only as precise as the measurement device, which is basically always discrete. For example, if it can measure mass in nanograms, then any difference smaller than that cannot be measured. More importantly, however, analog computers are task-specific machines. Making task-specific digital chips has historically been incredibly expensive, but that's changing now, with Intel, TSMC, Tower and others making custom chips. Custom digital chips built to solve specific problems can also save 95% or more of the energy compared to general-purpose digital computers; it depends on the problem.
Matt, you should have started with showing an op-amp differentiator or integrator circuit in operation to show that analog circuits do not "compute" they simply operate. Sure, most people are not electrical engineers, but that would show people what you're referring to. As your video is now, I'm sure that most people can't conceive of what an analog computer is comprised of.
This video takes the cake for the most buzzwords and the least substance of anything that isn't talking solely about 'AI'..
There is one new area in which analog computing could help us: the evaluation of neural networks. Neural networks are fundamentally analog, and are currently simulated digitally. It would be quite feasible to put tens of millions of analog electronic neurons on a chip.
One of the biggest hurdles to analog ASICs has been the extreme difficulty of designing the IC circuitry. In fact, the most expensive and labor intensive part of a processor has always been the analog circuitry to regulate and supply power, among other things. With the help of AI we’re going to see a massive explosion in this field, since it can easily pull from natural laws, fab constraints and desired outcomes to take much of the guesswork and iteration out of analog ASIC engineering. Literally machines building the next generation of machines.
Analog computers rely on physical properties like voltage or pressure to represent data. These physical properties can be influenced by environmental factors like temperature or noise, leading to errors in calculations. Since analog signals are continuous, analog computers can't achieve the same level of precision and accuracy as digital computers, and we need that for medical purposes. This is why tech scientists are still going the digital way, but this time trying to work with photons, which is faster because it uses light instead of electrons, producing less heat and using less energy... And of course, making transistors smaller and smaller cannot go on forever; that's also why companies like ASML and Intel, instead of only making transistors smaller, are making the manufacturing more complex, for example with patterning.
Good comment! I'd add that there is a class of analog circuits called switched-capacitor circuits, which are not continuous. They're typically used in ADCs these days, but were all the rage for a while.
Large analogue systems will probably be better at predicting / controlling chaotic events. The whole point of "expert systems" is to digitize more analogue-like actions and observations.
why are you using an apostrophe to make words plural?
@@pesqair oops, edited. thanks.
While working for Texas Instruments, I spent six months learning about the SEL 32/55 hybrid (analog and digital) computer. Ultimately, I worked on the digital side as a systems programmer. I left TI when they planned on transferring me to work on the Ada compiler. I wanted to be a VM/370 (BSEPP, SEPP, SP, etc.) systems administrator/programmer. I took a job as the lead systems programmer at the Owens Corning Fiberglas Research Center in Newark, Ohio. I loved the job and working with the IBM user group SHARE.
Like, the first 75% of this video talks about what analog computers are not. Add some snappy comments and a twist, and you still don't come close to what is promised in the video headline.
Excellent presentation. Thank you very much. As one who grew up with slide rules and Heathkits which I still have and use, it is a joy to follow your explanations. I applaud your use of the more precise and historic philosophical description of the distinction: discrete vs. continuous. Digital is really not exactly the same concept as discrete. In the 1950s as I was growing up and cutting my teeth on electronics courtesy of the US government selling complex devices that originally cost thousands of dollars for a quarter a pound, I also enjoyed building lots and lots of Heathkits. These collections of discrete (there's that word again) components and excellent manuals educated post war generations in the practical fundamentals of electronics and saved many dollars over factory assembled gear. I coveted the analog computer kits Heathkit marketed, but I never built one. I was thrilled to see The Computer History Museum in Mountain View, CA has one on display. It was all I could do to stop myself from twirling it's knobs and dreaming of the warm glow of its hot electron field effect devices, differential amplifiers, patch cords, and more. One observation on your presentation: You refer to a slide rule as an analog device, which it is. But in the same breath you declare an abacus also an analog device. While I'm not a skilled abacus user, I think the positions of the device's beads are always assigned to specific locations of clearly limited number; therefore calculating with an abacus is a discrete (though not digital) technique.
The large hall-sized computers you show at the start of the video are actually digital computers. They just used vacuum tubes instead of transistors so took up a lot of space and used a lot of electricity.
In 1971, analog computing was one of the engineering classes I was required to take to get an engineering degree. Generally the process was similar to what you show at 10:27 in your video. It was very helpful when looking at dynamic systems. You programmed the system by turning dials and moving jumper wires, so there was no easy way to store your program and reprogram the system. To create a practical analog computer you would need to develop digitally controllable capacitors, resistors and transistors.
I wish there was a betting pool for every time Matt Ferrell was proven incorrect because he believed some marketing hype. I would be a billionaire.
You mentioned using analog computers to calculate heat flow in objects. I remember using nonlinear partial differential equations to model heat flow in a metal bar from a point source. Bessel functions were used in the process of calculating the heat flow.
I personally view analog computing as simply "unprotected digital computing".
Correct me if I'm wrong.
Digital computing is essentially gating electricity such that you can move electricity along logic corridors to create predictable outcomes by compressing a whole range of analog signals into 2 (zero and one).
The compression is an engineering solution that allows us to concern ourselves much less with signal clarity and correction, and deal instead in these coarse bits.
Of course, the compression and gating also is a huge waste of the actual electrical signal. The analogy is like trying to watch TV through a white blanket sheet.
Instead of the high fidelity of the electrical signal itself, you get 0 and 1.
Taking off the protective layer provides us with a ton of untapped signal to play with... but it also comes with its own share of hairy issues. Noise and such.
Signal correction?
Don't know, just commenting to boost your comment so someone smart notices it and has a chat with you about it, so I can glean some dumb-people nuggets from the chat :)
Some semiconductor engineer (don't remember his name) said that the advances in semiconductor technology can be boiled down to discovering new techniques for making the circuits run at higher frequencies while preventing them from falling back into the analog domain.
Not really. The difference between digital and analog in terms of computing is that in digital computers you use symbolic representations composed of discrete bits, transformed during discrete clock cycles (which in math roughly maps to the realm of integers, sequences and so on), while in theory an analog system expresses values on a full continuous spectrum and operates on them in a continuous way (the real number line, differential equations, and so on). What you're describing is that we implement digital logic using what is an analog substrate (voltages that gate transistors): we discretize values (binary, ternary), run multiple parallel channels (bits) and clock the system. You could, however, implement a digital computer in any system that has finite discrete states; it's just that transistors are very fast, cheap and scalable, and actually consume almost no power when fully on or off (which is the ideal state for a digital computer).
In general, digital gives you the best precision and also the highest versatility, because symbolic calculations can represent anything you want to a given precision. Analog systems need to be tailored to every specific problem by rewiring their underlying hardware to represent the problem (or by acting as a fixed accelerator), and while they have advantages in efficiently calculating certain problems (those that describe physical systems), they are not efficient for other problems (the ones that map to discrete quantities). In fact, while we know that every Turing-complete digital system is equivalent, there have also been studies showing that you can map any analog problem to a digital version and vice versa... it just says nothing about the complexity of the transformed calculation.
TLDR: you're confusing a bit the physical layer with the conceptual layer, but indeed since voltages are an analogue quantity when you restrict them to represent discrete values you lose something the analogue system could do very naturally and efficiently, but this is a quirk of the specific system
@@fluffy_tail4365 The physical reworking issue got me wondering about modular analog systems that can be quickly transformed into different forms. Seems like a digital system could manage the transformation of analog systems into whatever form suited the need. I am obviously out of my depth here, just an idea.
@@fluffy_tail4365 Like the way Alphafold figured out protein folding. A digital system can be used to alter a physical form of coding information that does stuff in the real world.
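To make the discretization point above concrete, here's a tiny Python sketch (my own toy example; the reference voltage, threshold and noise level are made up): the same noisy voltage can be read as a continuous value, which carries more information but also the noise, or collapsed to a single bit, which throws away resolution but shrugs off small noise.

import random

V_REF = 1.0          # hypothetical full-scale voltage
THRESHOLD = 0.5      # comparator threshold for a single bit

def analog_sample(true_value, noise=0.02):
    """Continuous reading: carries the full value, but also the noise."""
    return true_value + random.gauss(0, noise) * V_REF

def digital_sample(true_value, noise=0.02):
    """Same noisy voltage, squashed to 0 or 1 by a comparator."""
    return 1 if analog_sample(true_value, noise) >= THRESHOLD else 0

print(analog_sample(0.731))   # e.g. 0.7243... (the value plus a bit of noise)
print(digital_sample(0.731))  # 1 every time, as long as the noise stays inside the margin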
Be very careful with the desuperheater on your home HVAC system. These systems use a simple heat exchanger to transfer the heat from the heat pump working fluid to the domestic hot water, but the catch is that this happens at high temperatures (duh). If you have hard water (many, if not most, municipal water supplies are hard water to varying degrees), the heat exchanger will quickly cause the minerals to plate out of the water onto the heat exchanger surfaces. The result is that the heat exchanger becomes less efficient over time and will eventually clog up completely. To combat this, you would need to add a water softener with a high enough capacity to feed the hot water supply. Such a water softener is very expensive to buy and maintain. The alternative is to have your heat exchanger cleaned out every year, which is time-consuming and messy. To be fair, electric and gas water heaters also have this problem with their heating elements, but electric hot water heaters have cheap and easily replaceable heating elements, and gas tanks typically directly heat the outside of the tank, so there is nothing that can really clog up.
a general purpose, multi-function device is great, but it'll never be as good at a specific task as a device built to do that ONE task as efficiently and effectively as possible.
As for analog physics, think of what's faster, pushing a button that activates an actuator that pushes a ball, or just pushing the ball yourself?
A good complementary development to the modern analog computer is the ongoing "second age of microcomputers", which has had a watershed moment lately through gradual consumerization in custom PCB design, low-cost microcontrollers and FPGAs, making it possible for hobbyists to manufacture dream machines deviating from the mainstream PC architecture. There is even a project called "Tiny Tapeout" that allows you to publish an ASIC design and get it anthologized with hundreds of other hobby designs on one chip.
There's a whole menagerie of retrocomputer-styled designs being made using these components, from the little ESP32-VGA, Agon Light and Neo6502 to more ambitious FPGA boards like the C256 Foenix and Commander X16. At first glance, they're nostalgic toys, but they signal a further decoupling of personal computing from industrial applications, an important counterbalance in an era where everyone complains about how much our devices and data are not really under our control.
I invented an analogue computer myself, back in the 70's. I proposed using the pulse frequency of a rectangular (digital) waveform as the physical variable. Summation would correspond to integration, driving a pulse generator. Differentiation would be via subtraction, again driving the pulse frequency.
I'm not sure whether it would work or not, but it could be very useful if it does.
Oh yeah totally, differentiating the pulse, I totally understand what that means and can see the potential ;)
Smart people like you confuse me, how are we the same species? :)
Why not build a test model? It could be a big thing if it works.
@@GordonSeal I tried, in Python, but I am not convinced that I got the model correct. It needs to be asynchronous, but wasn't. I'm unable to do much now.
I also suspect that A/D and D/A components will be required for the pulse generator interface, to ensure the asynchronous nature of the entire system.
After further thought, the best approach may be simply to charge or discharge a capacitor via the pulse charges, to control a pulse generator.
That may actually work, but may not offer any advantages.
@@HoboGardenerBen sounds like you need an analogy, to understand it.
@@dominus6695 ;)
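For what it's worth, here's a rough synchronous toy model in Python (the original attempt above was in Python too) of one reading of the pulse-frequency idea: each pulse dumps a fixed charge onto a capacitor, so the capacitor voltage approximates the integral of the pulse rate. The step size, charge per pulse and capacitance are arbitrary; this is a sketch of the concept, not the commenter's actual design.

DT = 1e-3                  # simulation step in seconds (assumed)
CHARGE_PER_PULSE = 1e-6    # coulombs delivered per pulse (assumed)
CAPACITANCE = 1e-3         # farads (assumed)

def integrate_pulse_rate(rate_fn, duration):
    """rate_fn(t) gives pulses per second; returns the capacitor voltage over time."""
    v, t, trace = 0.0, 0.0, []
    while t < duration:
        expected_pulses = rate_fn(t) * DT                      # average pulses this step
        v += expected_pulses * CHARGE_PER_PULSE / CAPACITANCE  # each pulse adds Q/C volts
        trace.append((t, v))
        t += DT
    return trace

# A constant 1000 pulses/s for 1 s should integrate to about 1000 * 1e-6 / 1e-3 = 1.0 V
print(integrate_pulse_rate(lambda t: 1000.0, 1.0)[-1])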
Having used analogue computing in the 70's, I can say they are a real pain to use for any "quick" investigation. You are always hitting over-voltage saturation, or the signal drops so low you hit thermal noise. Getting them to work takes a lot of fiddling. This is just a case of the grass looking greener over the fence. I will stick with digital processing with an analogue front end to get the signals into a form the digital system can handle well (best of both worlds).
So how does this analog revival mesh with the emergence of quantum computing? I'd love to see a follow on episode talking about this. Love your content!
I've often felt that Quantum Computing was just a really fancy method of performing analog computations. Yes, you get more direct control over effects like quantum entanglement, but that also comes with the downside of making the computer hard to scale. I expect we'll have powerful analog components in wide usage long before the GA quantum computer is cracked.
Quantum computing is about making digital act more like analog.
Isn't an abacus a manually operated digital computer? The beads are moved by hand but they still represent an on or off state. The slide rule is an analog manually operated computer. I used them a lot before we got digital calculators. When I was in the Air Force we used analog computers (large ones) and they were fantastic machines.
I was most fascinated by the analog chip's ability to run really effective & efficient image recognition software. *BTW, did you hear about the lost Disney sodium vapor method and how it's been rediscovered? The Corridor Crew YT channel just came out with a video where they collabed with a guy who figured out how to do this lost art in filming and media. It's actually a game changer. I recommend looking into it. It means we don't have to stress and mess with all the green screen effects only to still have them look off. I had no idea how fascinating this lost art was, or the effect it had on some amazing classic movies with a great unique look that modern movies just don't produce. I'm really hoping we see this method skyrocket in use throughout the film industry, as well as a return of 2D animation and other forms of creative media.
I watched that over the weekend. It was a great breakdown. It's crazy how simple the tech was back then and how much more efficient it is compared to today's tech.
I hear that sodium vapor has its own unique issues like requiring a very controlled environment (which you can get a sense of in the CC video). It may be better to use the knowledge of how sodium vapor achieved its effect while using a different frequency that may not have the same issue although there are limits to which frequency of light we can choose. I hear IR doesn't work due to objects giving off heat and I'm not sure it would be safe for anyone to stand in front of UV lighting for hours on end although it may allow using sun lit environments.
@@Tsudico They could probably recreate the effect with LED's programmed to output the color at that wavelength. Add in some basic software and light sensors to adjust the light to the room, and bam there's your solution.
@@ML-dk7bf One of the benefits of using other options like green/blue screen is that it can be used in natural light or outdoor environments. From what I've seen that doesn't work with sodium vapor which was what I was indicating. There might be alternative wavelengths of light that occur in natural light but that aren't used for film that could provide the benefits of the sodium vapor process in a greater variety of lighting conditions but only testing would know. Otherwise the sodium vapor technique, no matter which wavelength of light, just might be limited to specific use cases.
It is a truism, "you can't keep reducing the size of components forever."
... however, with the ever increasing speeds, I suspect you can upgrade the materials for those components... rather than silicon (semi conductor), use silver at the speed and temperature that it acts like a semiconductor...
Or switch to magnetic processing to spin electrons around the component rather than through them.
Also, a conformal coating of diamond (vapor deposit) would go a long way to ejecting unwanted heat.
@Matt, very good video. FYI: Standard digital microprocessors actually operate in an analog world. The only real differences are comparators that separate signals into high and low voltages (1s and 0s) and storage as a high or low voltage. We assign meaning to patterns of these stored voltages, like the ASCII table.
This sounds analogous to a CD or DVD being digital, with each pit being open or solid (reflective), but each pit is actually a physical area with a hole either blasted out by a laser or not. That hole (or solid place) is quite clearly analog. It can be measured, and their size and placement vary by nanometers, but the digitization of the reading laser changes all those 0.000002, 0.000034, 0.000001 reflectivity measurements to 0, and the 0.9999997, 0.999999, and 1.0000002 measurements to 1.
I guess they never get far at all from the 0 or 1. There will never be a .1 or .9, much less a .5, but with analog, the data/signal can be anywhere between 0 and 1, and it still gets processed by the next operator in the chain.
Got a THAT (The Analog Thing) and there is so much fun in using it for different calculations and simulations. Only wish it was twice as big sometimes.
Your quick shout out to Kiffness video Hold onto my fur (I like it) was awesome! More cat video references please!
I think as part of the digital computers portion, the sequential bit CAN be eliminated if needed. We actually do this in a number of cases, and it's called hardware acceleration. It requires making a physical piece of hardware that can do the calculation in one operation that would normally take multiple steps. For analogue computing this hardware acceleration is in many cases a mandatory step.
I don't mind the idea of combining the best of digital and analog computing, but it feels a bit short-sighted. The future isn't either one but bio-chemical computing, where we don't use 1 and 0 but a breadth of options, in the same way our brains work.
Some things to keep in mind: an individual neuron can operate up to ~1 kHz, and most systems (muscles, heart, etc.) behave as chaotic systems. Side effects include reaction time, massive parallelism, synchronization and de-synchronization. Keep in mind it takes years and decades to learn how to use our brains.
I wonder a lot about the potential of photonic computing. Each frequency of light is its own data stream, so one beam can hold many streams, super cool. I suspect you are correct in thinking the overall structure of computing has to change. Just a gut feeling, I have no expertise in this stuff.
Copying living systems sounds chaotic, harder to make specific and purposeful. I'm seeing crystals and light computers, take the idea of structure to higher levels. No reason we couldn't fuse photonic and biochemical computing together, perhaps like some sort of future copy of the brain hemispheres. Seems like we want to create gods to rule over us. Makes sense, our entire history is filled with myths of beyond-human powers, we're just fulfilling the prophecies we wrote. After we experience real gods, we'll probably get over the whole idea and wish we could go back to them being imaginary.
This video reads like so many other speculative science printed media articles one used to read in pop science magazines and the like - "one day we might have solar cell collector arrays orbiting around the Earth, beaming down electrical power through invisible streams of microwaves", like. Of course nothing ever came of any of that.
Limitations with digital computers aren't necessarily actual limitations, but rather limitations in ourselves that we need to get over. 8-bit digital computers CAN process larger numbers than 256 discrete states; you do it in software, paying the price in calculation speed and in the storage space needed for the code and temporary data. Amdahl's law is only a limit if you can't change the nature of the problem you're attacking to go around it. Replace the algorithm with one that includes fewer or no sequential processing steps, or subdivide the problem into smaller chunks that can each be parcelled out to individual CPU cores, things of that sort. We didn't get to where we are today by only seeing problems instead of potential solutions.
We have digital computers because they're flexible, reliable and anywhere from reasonably to extremely powerful. Analog machines, well, not so much really. An analog computer built to do one thing can't do another (like, forget about adding new features through software patches - all the "patching" you'll be doing will be hardware), and they don't return precisely the same answer every time. Digital computers have margins of error as well, but their errors are dependable, predictable. Analog computer errors also would have the potential of changing and growing over time as the machine is used, due to wear and tear.
Will there be no room for analog computers in the years ahead? It would be silly and irresponsible to completely dismiss them (one just has to remember that famous IBM boss back in the 1950s who "predicted" six computers or whatever it was would be sufficient to cover the needs of the entire world), but replacing digital computers in general/on a large scale? No way, forget it.
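Two quick Python sketches of the points above (my own toy examples, values picked for illustration): first, a 16-bit addition done with 8-bit words and a carry, the way an 8-bit CPU handles it in software; second, Amdahl's law, speedup = 1 / ((1 - p) + p / n), where the whole game is changing p, the parallel fraction, by restructuring the algorithm.

def add16_on_8bit(a, b):
    """Add two 16-bit numbers using only 8-bit operations plus a carry."""
    lo = (a & 0xFF) + (b & 0xFF)
    carry = lo >> 8
    hi = ((a >> 8) & 0xFF) + ((b >> 8) & 0xFF) + carry
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

print(hex(add16_on_8bit(0x12F0, 0x0315)))   # 0x1605

def amdahl_speedup(p, n):
    """Speedup on n cores when a fraction p of the work is parallelizable."""
    return 1.0 / ((1.0 - p) + p / n)

print(round(amdahl_speedup(0.90, 16), 1))   # 6.4: 10% serial code caps the gain
print(round(amdahl_speedup(0.99, 16), 1))   # 13.9: shrinking the serial part beats adding cores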
Honestly, the idea of marrying analog sensors, which can detect change without needing direct constant power, and a digital always-on hub of information just made me think we're not far from cyborg / android bodies. The digital mind, requiring the more immediate power source (similar to a regular brain), would be able to read feedback from sensor points that don't require as much power until triggers occur that activate them. Maybe I'm thinking too far out on it, but this seems closer than it has in the past.
If analog memory can be created, where previous outputs can be retained at significantly lower energy retention costs rather than constantly having to generate new ones, then yeah, I can see prior referenceable memory becoming a real thing, and maybe then Assembly Intelligence (what I call AI lately) can start to store past "experiences", and then we're not far off from actual AI (Artificial Intelligence).
Interesting but Chilling Stuff
I started working with ML applications on microcontrollers about 5 years ago. Basically, once the algorithm was determined, it was just a long polynomial. I came to the realization that it could be done with an analog circuit with basically no digital computation delay. So I think that analog has an opportunity. Back in the 40s, 50s and 60s, rocket guidance systems were analog computers. In the late 60s the Apollo guidance computer was the first digital one that could take specific inputs for configuration changes.
lol no. A nightmare to debug this thing.
I literally see zero serious application of analogue computing in anything that requires actual computing. You can make a dedicated chip on the motherboard for time, date, a calculator and other pretty irrelevant stuff, but for calculations such as gaming, encoding or encryption you can never beat digital, regardless of the energy savings and other irrelevant "arguments".
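The "long polynomial" comment above is worth making concrete: once a model is boiled down to fixed coefficients, inference is just a chain of multiply-accumulates, which is exactly the operation analog circuits do cheaply. A minimal Python sketch with made-up coefficients:

def evaluate_polynomial(coeffs, x):
    """Horner's method: (((c3*x + c2)*x + c1)*x + c0), one multiply-accumulate per coefficient."""
    result = 0.0
    for c in coeffs:            # coefficients ordered from highest degree down to the constant
        result = result * x + c
    return result

print(evaluate_polynomial([0.02, -0.3, 1.1, 4.0], 2.5))   # 5.1875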
Neuronal connections are a combination of analog and digital, to give some context to your video. The various inputs to a neuron from surrounding neurons are analog signals. The final, summated output from the neuron is digital, as it is either on or off; this is called the action potential.
So we need a combination of both in my opinion.
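A toy leaky integrate-and-fire neuron in Python makes the analog-in / digital-out point above concrete: the membrane potential accumulates weighted inputs continuously, but the output is an all-or-nothing spike. The threshold, leak and weights here are purely illustrative.

def simulate_neuron(inputs, weights, threshold=1.0, leak=0.9, v_reset=0.0):
    v, spikes = 0.0, []
    for step_inputs in inputs:                       # one list of input currents per time step
        v = leak * v + sum(w * x for w, x in zip(weights, step_inputs))
        if v >= threshold:                           # action potential: fire, then reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

print(simulate_neuron([[0.2, 0.1]] * 10, weights=[1.0, 2.0]))   # [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]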
I'm not convinced. I mean, right now AMD can pack a LOT of AI processing power (AI cores) INTO a CPU, not onto a big fat analog chip.
And this has always been a thing in electronics, how you can simulate analog with digital.
Nvidia dominates this world because AI is a lot more than just hardware, and no one is touching Nvidia THIS decade.
You can't just have a piece of hardware that works well, you have to have SOLUTIONS to customer problems and no one is better at this than Nvidia, they have the best software engineers working these problems and have already solved some amazing issues.
Like, find out how long it takes TSMC to make a mask, which is used to make an IC using EUV lithography, and these masks have to be perfect. It's many servers running for a few days. Nvidia has gotten this down to a single-server operation in less than a day. Do you know how much energy savings that is? Do you know how critical that capability is for TSMC? They can charge customers a lot less money over time, so if a customer has to change a circuit it's not so painful.
So THAT is what AI is, it's not a chip.
Yes indeed, I remember doing analog computers in some university classes. The basic idea was to translate a system transfer function (the math equation that defined its behavior) into an analog circuit design with the same transfer function, but instead of having a physical input like force or speed, you supply voltages, currents and sine waves. The result was an output that represented the real physical response.
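As a small illustration of that translation (my own example, with illustrative values): a first-order lag tau * dy/dt + y = u(t) is the kind of transfer function a simple op-amp integrator setup realizes directly. Integrating it numerically in Python shows the same step response the analog machine would display as a voltage.

def step_response(tau=1.0, dt=0.01, duration=5.0, u=1.0):
    """Euler integration of dy/dt = (u - y)/tau, i.e. a unit step into a first-order lag."""
    y, t, trace = 0.0, 0.0, []
    while t <= duration:
        y += dt * (u - y) / tau
        trace.append((round(t, 2), round(y, 4)))
        t += dt
    return trace

print(step_response()[-1])   # after five time constants y has risen to roughly 0.993 of the input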
Analog computing is huge in computational neuroscience. A whole subfield, neuromorphic computing, is dedicated to predicting how neurons behave using analog computers. The way neurons fire, have ion channels open, and signals propagate is all governed by differential equations, so doing that on a chip is much more efficient and quicker.
They are not governed by differential equations, they can be described by differential equations. Any system that changes can be modeled with differential equations.
one minute in of watching my first video by Matt Ferrell and I've already liked and subscribed. VERY NICE!!!
There is so much analogue in industry. Instrumentation devices send 4-20 mA signals, and control valves are controlled by 4-20 mA outputs sent by PLCs. I'm sure the cooling in data centres is controlled by analogue valves and PID.
Most electrical networks still use analogue measuring transformers, and generator excitation and other complex feedback loops use PID. Full digitalisation has only occurred in cyber and the internet.
From what I understand, most industries use PLCs whose programming languages are discrete, such as Ladder. The only analog part is the transmission in certain cases.
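For anyone curious what a PLC actually does with those 4-20 mA signals mentioned above: it is just a linear map from loop current to engineering units. A minimal Python sketch (the 0-150 degC span is an arbitrary example):

def current_to_engineering(ma, low=0.0, high=150.0):
    """Map a 4-20 mA loop current to a process value, clamping out-of-range readings."""
    ma = max(4.0, min(20.0, ma))
    return low + (ma - 4.0) / 16.0 * (high - low)

print(current_to_engineering(12.0))   # 75.0: mid-scale current means mid-scale temperature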
Differential equations are just algebra with some calculus thrown into the mix.
You get to solve these types of equations a lot when designing the control systems for robots. Spring equations are especially annoying to solve because they oscillate, and you need to predict that oscillation or your robot won't be able to take two steps without falling over.
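The spring equation mentioned above is m*x'' + c*x' + k*x = 0; stepping it through time in Python shows the oscillation a controller has to anticipate. The mass, damping and stiffness values here are arbitrary examples.

def simulate_spring(m=1.0, c=0.2, k=5.0, x0=1.0, v0=0.0, dt=0.001, duration=10.0):
    x, v, t, trace = x0, v0, 0.0, []
    while t < duration:
        a = (-c * v - k * x) / m      # acceleration from the spring and damper forces
        v += a * dt
        x += v * dt                   # semi-implicit Euler keeps the oscillation well-behaved
        trace.append((t, x))
        t += dt
    return trace

late_peak = max(abs(x) for _, x in simulate_spring()[5000:])
print(late_peak)   # the amplitude has decayed, but the mass is still oscillating after t = 5 s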
Hi Matt. I asked myself the same Analog or Digital question when I met with the Boeing engineers in 1967. I had been hired to go to the USA because they believed me to be the guy with the answers. They showed me their best analogue computer and then the fastest scientific digital computer available at that time (an IBM 360/65). I wrote a few algebraic/calculus equations (using Fourier analysis) based on the physics of the natural world, plus a few mathematical modelling scenarios. I tried to run them on the analogue system first, because the natural world we live in is not black or white with on or off. Our world is naturally a continuous analogue with a morphing of possibilities, like the transitions in a graphic as you shade from one color to another.
It turned out that one set of equations generated an even more complex set of equations that were beyond the understanding of even the best theoretical Mathematicians.
When I used Fortran (and matrix theory like upper Hessenberg) on a digital machine I could at least see a way forward to modelling the actual physical world, even if it was an approximation. By using "quad precision/4-word" floating-point variables with 32 decimal digits I could get close enough for the Boeing aircraft designers and space scientists to get the answers they needed.
Digital Computing is an approximation and faster computing has meant we can get some kind of results faster. However they are not a true representation of physical reality and especially of how the neural networks within our Human minds work to solve physical problems.
So to come back to the question, Analogue or Digital: the answer is eventually Analogue, but certainly not with the current elemental and chip-based technology. The time it will happen is when we develop "organic computing". There are some early signs of research into the cellular structure of neurons and the associated storage of information in simulated brain cells and memory. But we are at the early stages, and I am personally doubtful whether it is a scientific journey we should take at this time of such cultural ignorance.
So as you ponder what is AI you should perhaps consider the nature of human intelligence at all and the glory of what it is to have a mind generations ahead and already infinitely more capable than any we are likely to fabricate in this so called "Modern world". Party on.
My first contacts with electronic analog computers were the Bosch fuel injection controls, first used by Volkswagen and Volvo, starting in 1967. Some of those cars are still in use. The computers worked pretty well. When they didn't, though, only dealers had ways to test and verify that the computer was/wasn't the issue. Woe visited the owner who owned a fried one. Careful jumper-cable use was an absolute must.
Analog makes the most sense for this sort of problem. To me, it's obvious. The way AI works is mixing values in simple ways that can all be done with analog, and you do not need decimal representations in the middle at all. You could have real time responses of unbelievable complexity, because you don't need to wait for clock cycles, just wave propagation, which is as fast as you could physically achieve
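Concretely, the "mixing values in simple ways" above is multiply-accumulate: a neural-network layer is a set of weighted sums, y = W x. An analog crossbar performs every multiplication and the summation at once as currents adding on a wire, while a digital chip loops over them. A tiny Python version with made-up weights:

def layer(weights, x):
    """One dense layer: each output is a weighted sum of the inputs."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

W = [[0.5, -1.0, 0.25],
     [1.5,  0.0, -0.50]]
print(layer(W, [1.0, 2.0, 4.0]))   # [-0.5, -0.5]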
analog computers would have to be rebuilt for each different problem. they can't be reprogrammed they must be physically reconfigured. i don't see a way around that.
The author said a new problem is solved by the differential equation which expresses it. So each new problem only requires a new differential equation. So that is a wrong statement on his part?
yes but then you have to simulate and build the new differential equation with analog components.
The numbers are wrong. I checked the energy use on LLaMA 2, which I run locally: less than 1 W. It's more like comparing apples to oranges.
As an undergrad student in 1968 at Sydney Uni I saw a demo of analog computing. The target system was a pendulum, which was modeled in a hand-built electrical analog computer. The output was a voltage. It was configured to run on the same time scale as a physical pendulum. Both the pendulum and a very large voltmeter were set up side by side, and the pendulum and the analog computer were set running at the same time. The pendulum and the voltmeter needle mirrored each other for perhaps a minute and were still doing so when the lecturer shut them down.
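For reference, the equation that hand-built machine was solving is the pendulum equation theta'' = -(g/L) * sin(theta). A digital solver has to march through time steps, as in the Python sketch below (the length and starting angle are illustrative), whereas the analog circuit's voltage simply evolves alongside the real pendulum.

import math

def simulate_pendulum(L=1.0, theta0=0.2, g=9.81, dt=0.001, duration=10.0):
    theta, omega, t, trace = theta0, 0.0, 0.0, []
    while t < duration:
        omega += -(g / L) * math.sin(theta) * dt
        theta += omega * dt
        trace.append((t, theta))
        t += dt
    return trace

print(simulate_pendulum()[-1])   # the angle after 10 s, still swinging with a period of about 2 s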
The German translation is magical, with a slight American accent, but very easy to understand!
When I spoke about the physical limits of electronic chips back in 1999-2000, I said we would surely come back around to a mix of digital and analog chips, especially in the wide worlds of Artificial Intelligence and cryptosystems. (Most people laughed; a very few of us were unhappy not to be in this new area of science and technology.) The whole computing system will be reinvented, with new ways of programming, coding and making software, across every class of machine from supercomputers and mainframes to mini and micro computers and transputers (parallel-CPU systems that I discovered and loved working with for the future of astrophysics and astronautics, in Algeria).
It's wrong to think this will stop the evolution of digital systems in hardware and software.
Here we are now rethinking things with quantum computing, and with the new discoveries from the James Webb telescope that overturn a great number of theories and assumptions, such as the time evolution, limits and scale of the Universe.
Good luck to the new generation of researchers, discoverers, scientists and mathematicians!!!
I was waiting for you to talk about near lossless data storage for music & photography because of the recent analog chip breakthroughs. Storing analog on a chip will be a real game changer in those fields. Being able to record & copy perfect analog signals will make vinyl records, tape, & film redundant. That's going to have such a huge impact.
In audio field, you have just reinvented BBD (Bucket Brigade Delay). Having a non-mechanical memory still requires sampling and thus anti-aliasing filters. Even if audio quality is improved beyond what BBD offered, I don't see any quality benefit beyond digital storage. Digital audio is transparent already. Quantization you want to eliminate doesn't degrade quality, thanks to dithering. It may however be useful for more efficient near-lossless (not truly lossless because you can't make perfect copies in analog) storage.
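The dithering point is easy to demonstrate in Python: for a constant input sitting between two quantization levels, plain rounding returns the same wrong level every time, while adding a little random dither before rounding makes the average land on the true value, so the error becomes noise rather than distortion. The step size and input value below are arbitrary.

import random

STEP = 0.1            # quantizer step size
TRUE_VALUE = 0.437    # sits between the 0.4 and 0.5 levels

def quantize(x):
    return round(x / STEP) * STEP

plain = [quantize(TRUE_VALUE) for _ in range(10000)]
dithered = [quantize(TRUE_VALUE + random.uniform(-STEP / 2, STEP / 2)) for _ in range(10000)]

print(sum(plain) / len(plain))        # 0.4 every time: a systematic error
print(sum(dithered) / len(dithered))  # about 0.437 on average: the error is now just noise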
If I may ask a follow up question: AFAIK a main constraint with analog computers is the lack of error correction capability, now in the context of ML I can imagine that since the answer is usually some sort of probability distribution or some statistical estimate, one could forgive some numerical errors, but is there an actual estimate or proof that this is the case? Else how are analog enthusiasts planning to deal with this?
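Not an answer, but a toy illustration of why ML inference can tolerate analog imprecision: perturb the weights of a tiny linear classifier with Gaussian "analog" noise and count how often the decision flips. The weights, input and 5% noise level below are made up purely for illustration; actual guarantees would need the kind of analysis the question asks about.

import random

WEIGHTS = [0.8, -1.2, 0.5]
INPUT = [1.0, 0.3, 2.0]

def decision(weights, x):
    return sum(w * xi for w, xi in zip(weights, x)) > 0

flips = 0
for _ in range(10000):
    noisy = [w + random.gauss(0, 0.05 * abs(w)) for w in WEIGHTS]   # ~5% noise per weight
    if decision(noisy, INPUT) != decision(WEIGHTS, INPUT):
        flips += 1

print(flips / 10000)   # fraction of decisions flipped by the noise; near zero when margins are wide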
Can anyone recommend channel(s) that cover similar topics with greater accuracy/expertise? Matt covers interesting topics, but I notice significant errors anytime he covers subjects that I truly understand, and I guess it’s just not reasonable to expect accuracy in extremely complex subjects from anyone who lacks a PhD in said subjects.
Sabine Hossenfelder actually knows what she's talking about in her videos instead of mindlessly repeating hype. She mostly covers physics though.
Might want to lookup advances in neuromorphic computing. Mixed-mode analog digital mimicking the human brain to solve problems. AI opportunities without massive data centers and huge memory requirements.
If Mythic can deliver the same compute power as a GPU, then why does it not need a PCIe slot? All of the pictures shown had connections with far fewer pins than a modern GPU, so I am wondering how they plan on getting the bandwidth?
Thanks Matt, this is a great follow-up to the Veritasium vid on Mythic.
Saw a video about a battleship targeting computer from WWII. You’d set the distance, wind speed, etc. It would also consider the speed and tilt of the ship. It was 100% mechanical with all kinds of cams and linkages to calculate the proper firing solution.
Dang, what a neat topic! This is my favorite video of yours I've seen in a while. I'm excited to see which components of compute are better suited to analog hardware. The power savings sure are enticing for applications where one thing needs to do a very specific job for a long time.
This material deserves a preceding video just explaining the key concepts behind the final message. It is not easy to capture the essence of analog computers. Anyway, today I discovered your site. Good, insightful material; thanks for sharing this.
So which ascii key-combo do we press for the "delta" symbol? We're all going to need that for applying vectored deltas through logarithmic analysis and output
Before Digital Signal Processing was a thing, equalizing the frequency response of a room or car cabin was done with adjusting the gains/cuts of analog circuits called bandpass filters. That was the technology used for decades before adjusting numbers in Fourier algorithms in code uploaded to a chip to change the sound of your audio source, making the room or car sound more pleasant and balanced across the audio spectrum.
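The digital descendant of those analog bandpass filters is a biquad section. As a sketch, here is a second-order band-pass in Python using the widely circulated Audio EQ Cookbook coefficient formulas; the sample rate, centre frequency and Q are example values.

import math

def bandpass(samples, fs, f0, q):
    """Filter a list of samples with a 2nd-order band-pass centred on f0 Hz (0 dB peak gain)."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x2, x1, y2, y1 = x1, x, y1, y
        out.append(y)
    return out

# A 1 kHz tone passes through a 1 kHz band nearly untouched; tones far from 1 kHz are attenuated.
fs = 48000
tone = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(4800)]
print(max(abs(v) for v in bandpass(tone, fs, f0=1000, q=1.0)[2400:]))   # roughly 1.0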
And in the meantime, high-speed digital hardware design is already very analog-heavy: balanced signalling, impedance matching, traces becoming antennas...
There are tremendous challenges with analog computers in two directions. First is noise, and therefore challenges in scaling down the size of components. That means analog computers are very large compared to digital ones of the same compute power. The point about solving things quickly is that the analog computer solves one specific problem, while our digital computers solve a large range of problems. Anyone who has worked in analog computing (and they were standard in control theory classes back in the day) knows that they have to be reconfigured and scaled for each problem. So yes, theoretically they can take less energy than digital computers. So can asynchronous computers (ones that don't run on a master clock). The problem is that they have other limitations. A potentially better use would be as a co-processor, where a specific computation is managed by an analog engine under the control of a digital computer.
Imagine an analog computer AI assistant that was nothing but an analog microphone, an analog computer, and an analog speaker, but you could hold a conversation with it just like ChatGPT. That would be crazy.
I'm curious what the Rare Earth demands of Analog computers is vs their Digital counterparts? Though any reduction in Energy use is a net reduction in Rare Earth use because it means fewer solar panels are required.
Ah, analog computers are great new offer to save mankind from those sneaky bits.
I retired in 2K. I have heard this SAME pitch no less than four separate times during my working life.
Now we are getting it every 9 years or less. It sounds great. It isn't.
Why was the Bismarck unable to shoot down the painfully slow British Swordfish torpedo planes? Analog computers.
True. They are not reprogrammable in the field. Never were. These aren't either. Yeah, tiny patch cords, such an improvement.
As to Moore's suggestion from 20y ago, the industry blew right past his concern.
Watch this, don't share it with the analog evangelicals, not without a medical team nearby
th-cam.com/video/Gzkb3Zc8pGE/w-d-xo.html
As part of my Master's thesis I am analyzing analog computers to determine whether they may be better than digital computers for building a cognitive architecture. I find it really fascinating and do believe philosophical reflections about the discrete/continuous structure of reality and human experience are key points in the machine consciousness field.
Dallas OR and The Dalles OR are about 150 miles from each other.
One is rain forest and the other a desert, that's a pretty huge distinction when talking about water usage.
It doesn't matter anyway, the water is not destroyed.
For impact on the world of processors, more than analog computing I'm looking at backside power delivery, which finally separates power circuits in a chip from computing circuits, and more advanced designs like the Mill CPU (still coming after >10 years), which re-attempts VLIW and promises GPU/DSP-level efficiency brought to a general-purpose instruction set.
I foresee a computer that will mix analogue, digital, and quantum computing in such a way as to play to each one's strengths. In particular, you could use the digital chip to calculate how to program the analogue chip. I have no idea if this would use less power than simply doing it with the digital chip, though. Perhaps you would lose power for a single calculation but gain it if the solution only had to be slightly modified for each of a large number of calculations.
Would it be possible to integrate analog computing into digital computing systems? I know there is some music hardware these days that uses digital control of analog parts, but I don't know if that's applicable to analog computing.
A wise artist once said;
“Welcome to cyber-space, I’m lost in a fog
Everything’s digital, I’m still analog”
-Joe Walsh
9:07 We will get down to one molecule, then smaller molecules, and then integration begins, like files being compressed: smaller, with less resistance, allowing lower voltage.