Noise is by far the biggest issue with analog computing: it accumulates at every step, and getting rid of it is very complicated. I would have loved to hear how they deal with it, but I guess they wanted to attract investors with buzzwords instead.
Same. I didn't fully realize how much of an issue the signal-to-noise ratio was in practice until I read through some of the comments here. But I don't want to follow too much of what is already done until I create my own model.
@@bengsynthmusic Yeah, I actually may have overstepped my knowledge in a conversation on quantum with regards to noise the other day. I should go back and humbly apologize.
I love the certainty with which they said 30 billion units in 10-ish years, and the vagueness about how it actually works: "Yeah, a bit of analogue and a bit of digital." My degree is in robotics and mechatronics engineering, so I am interested in this topic, but I learnt nothing from this video.
There are a few prosthetic arm/hand makers here on YouTube in particular who might be interested. Biological "analog signals" can be more than just heartbeats (especially since the beating isn't a direct signal itself; it's a signal from the brain transported down the vagus nerve).
Do you know how much this is gonna cost right now in a low production run, lol? Each chip they could sell right now could be millions apiece when you take into account the parts of the wafer with substandard performance. You're going to have to wait 5-10 years for scale to make it accessible to hobbyists. I'm sure if you're a billionaire with a maker hobby, they'd be happy to sell you one right now.
We've had programmable analog arrays for a while now. They're similar to FPGAs, but with analog building blocks, and generally haven't seen much market adoption, since most mechanisms that need analog electronics can be made more cost-effective, efficient, and easy to produce by just building the analog circuits necessary as purpose-built systems. There are definitely interesting applications for field programmable analog arrays, but I think they're largely overestimating the general usefulness of it - understandable, since they want to make money!
That's how I've felt about it: great for edge computing, but people forget just how finicky analog is with different transistor responses, shielding for noise, voltage drop, etc., etc. It just isn't reliable for long and intricate calculations. A standard 64-bit floating point number has orders of magnitude more SNR than even a shielded wire.
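A rough back-of-the-envelope sketch of that SNR point, assuming the usual ideal-quantizer rule of thumb (about 6.02 dB per bit) and a guessed dynamic range for a good analog chain:

```python
def quantization_snr_db(bits: int) -> float:
    """Ideal N-bit quantizer SNR for a full-scale sine: ~6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

# A 64-bit float carries a 53-bit significand; treating it like a 53-bit
# fixed-point quantizer gives a (very optimistic) noise-floor estimate.
print(f"float64 significand: ~{quantization_snr_db(53):.0f} dB")

# A well-shielded analog signal chain might manage on the order of
# 100-120 dB of dynamic range (assumed figure, for comparison only).
print("shielded analog chain: ~100-120 dB (assumed)")
```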
Let's say they truly solved the finickiness of analog. They are still providing very focused processors that require a physical upgrade. I think they captured their limitations well in the video, so there is something to be said for that... But if you wanna upgrade, replace the hardware. Dang, that's starting to sound like Apple.
@@QuintinKerby lol on the Apple point! To your main point, though, I feel like having an attached FPGA would make much more of a difference than what analog could do for high-end computing. Analog really is finicky, but it's great for simpler tasks due to energy efficiency (like edge computing for a simple neural network). The issue, from what I understand, is that it falls apart pretty fast when precision is required. Analog signal processing is kind of like black magic...
I know of a consumer device (honestly really a prosumer/professional device, but available to any consumer) that actually utilizes this right now (and was released about 5 years ago), and it's not super popular. In the audio processing world there is a device called the McDSP Analog Processing Box. It has 1 FPGA that interfaces with the computer to program the analog arrays to mirror the processing that the McDSP suite of audio plugins does. (For those who aren't in the audio world, all of those plugins are actually modeled off of older existing analog devices.)
Because he talked a lot about how analog computing is great (it is) but said effectively nothing about why his company is great or how they get these outrageous efficiency claims. I can also create a good presentation with even better claims, but that will not fix the underlying issues analog computing has. @@jackytaly
Well, if you can't figure it out, you're not very good at analog circuit engineering. Most will lock mentally onto DC voltage offset, forgetting that analog signals are not steady-state signals, so adjusting threshold levels to compensate would never be effective via, say, a resistive ladder, while tossing in some wave shaping and capacitors could make all the difference. Or even using clipping. Wide palette to work with, depending upon the specific problem.
Probably similar to gamma spectroscopy, where you set a lower threshold value and then a max value, essentially creating a "window" that a peak has to fall within, in order to filter out noise. This is essentially how a discriminator works - it eliminates outliers that are too high or too low - and with an adjustable window (or discriminator aperture) you can fine-tune it to the signals you are interested in.
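A minimal sketch of that windowing idea in Python (thresholds and amplitudes are made up; a real single-channel analyzer does this with comparators in hardware):

```python
def window_discriminator(pulse_amplitudes, lower, upper):
    """Keep only pulses whose amplitude falls inside the [lower, upper] window,
    rejecting outliers that are too small (noise) or too large."""
    return [a for a in pulse_amplitudes if lower <= a <= upper]

pulses = [0.02, 0.95, 1.30, 0.88, 3.10, 0.91]              # arbitrary example data
print(window_discriminator(pulses, lower=0.8, upper=1.5))  # -> [0.95, 1.3, 0.88, 0.91]
```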
Programmable gain or attenuation is simple to do. But the whole concept is just exploiting people's lack of knowledge as they bang on about stuff you can easily do digitally at extremely low power, getting years from a battery... His banging on about your TV remote draining is BS, as remotes don't transmit until you push a button, i.e. the button press connects power to the remote's chip. Besides, whether you get 6 months or 5 years out of a TV remote battery is not a big deal.
This is typical of most of the 'stuff' on YouTube. Ask an engineer to question these guys and they will start with "What can it do?" OK, fine, but also "What can't it do?" Now show me the sweet spot 'you think' will propagate... then leave it up to creative folks to find other uses. Wish we had those kinds of videos.
It uses Ohm's law and KCL to measure the voltage drop over a known resistor value. It has a ton of these basic loops in a large matrix, so it can perform a crazy number of calculations per unit of power compared to what a GPU built from our nearly maxed-out transistor chips can functionally do now. The big problem with these chips is their inaccuracy due to noise, and it is hard to filter every single signal without converting it to digital, which defeats the entire purpose. I think these chips really do have a future in our technology, but in the same way quantum computing is a crazy upgrade: these forms of computing won't take over from digital chips, but instead they will add efficiency to help where digital tends to struggle. In simpler words, these chips are suitable for very basic functions that run at very high speed with a cheap power requirement, so machine learning is perfect for this, as it is just a brute-force method of programming.
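For what it's worth, the textbook version of that Ohm's-law/KCL trick is the resistive crossbar: weights stored as conductances, inputs applied as voltages, and each column wire summing its currents. A tiny numerical sketch of that idea (illustrative only, not this particular company's design):

```python
import numpy as np

# Resistive-crossbar multiply-accumulate: I = G * V in each cell,
# and Kirchhoff's current law sums the currents down every column wire.
voltages = np.array([0.3, 0.7, 0.1])            # input activations, in volts
conductances = np.array([[1e-6, 5e-6],          # weights stored as conductances, in siemens
                         [2e-6, 1e-6],
                         [4e-6, 3e-6]])

column_currents = voltages @ conductances       # the summation happens "for free" in the wires
print(column_currents)                          # two analog dot products, in amperes
```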
Every time they say 'wake up a digital chip to do communication' you should know that they are selling snake oil. Yes, everything in the real world is analog, and so is all communication, including WiFi and Ethernet: the physical layer is always analog. If there was any substantial info in this video, I would have hoped for them to show a minimal sensor reading transmitted wirelessly (433 MHz or similar). It doesn't have to be WiFi, but that's a great buzzword that everyone knows... And just repeating 'it will be more efficient' doesn't improve your claim at all. All the stuff about analog co-processors that wake up larger digital ones, e.g. for voice activation, is already done with digital coprocessors.
I don't know enough to be able to call them out on any specifics, but the whole way through the video a bunch of red flags were going off in the back of my mind, and the fact that at no point did they explain how this thing works just makes me doubt that it actually works at all. You can't simply say that you addressed the issue of analog signals degrading over time without any further insight into how, and what about the general noisiness of analog? I remember being taught that the problem with trying to do calculations on analog measurements is that the small errors accumulate into much larger errors very quickly, hence why we convert stuff into fixed (digital) numbers first. So this would only be useful for simple tasks like comparing input against predefined patterns, and then it would need to call a digital system to do anything even remotely complex. I'd love to be wrong and see this actually taking over the world in a decade or so, it'd be really cool, but for now I'm gonna be cautiously skeptical about the whole deal.
In my opinion they did a horrible job of explaining what their device even is and does, and how it works. All I heard was: jargon jargon, cool analog circuit, programmable, makes big change, believe us. I still have no idea how it works. Analog in -> magic -> ? out. What functions can it exactly perform? Is it Turing complete? Etc. Sounds like a bad investor pitch.
@@extended_e Yeah, it doesn't even hint at what it achieves or how, and the only demonstration is that it can recognize one specific sound... that's coming from a recording, so it's always the exact same sound wave - not particularly impressive. I remember automatic answering machines doing the same back in the 80s or 90s. I'm not even sure if this thing is actually doing things in a truly analog way; the vagueness of it all makes me wonder if this isn't just a fancy DAC, or worse, repurposed audio equipment. And nowadays whenever I hear someone saying "AI" and "machine learning" multiple times in a short presentation, a "potential BS" alarm goes off in my brain.
Ones and zeros are really just electric pulses, so digital is basically analog. But no, the video does not go into detail about what it is talking about and therefore means nothing to the viewer who is trying to understand. I think the real question is: what energy-efficient method is used to create the pulse wave information?
@@extended_e This is it... what function can it perform? I set a thermostat. It has to have a threshold and hysteresis. It controls a relay. That has a threshold and hysteresis too. I need some sort of ADC or quantisation of signal levels to actually do anything with a thermostat and a relay... exchange the relay for PWM or something... it still needs quantisation. I can have a mechanical thermostat that uses bellows and levers. That's analog. But it still operates a digital switch: on, off. I could maybe throw in a resistor, make it drive a potentiometer... or maybe a saturable reactor, or a movable-core inductor... and have it vary linearly, smoothly... but to what end? It doesn't improve its efficiency, it makes it worse... the device doesn't need such complex controls, just on and off.

Whereas if I want to produce a parabola on my lathe, I can either make some linkages and pivots that follow the curve x = y^2... or I can convert it to CNC and whack digits into a computer. The linkages - that's analog, and the result is analog. The computer? Its output is analog, but everything else was done digitally... and depending on resolution, get right down to it and that "analog" surface is just small thresholds, levels, quanta... In my mind, "analog computing" in the electrical domain is just a distorting amplifier, the output simply being the input having been distorted in some manner, be it phase, rise time, or other things... in some scenarios that's useful... in others, it's a waste of time.
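Since that thermostat example leans on thresholds and hysteresis, here is a trivial sketch of the bang-bang logic being described (setpoint and band are invented for illustration):

```python
def thermostat_step(temp_c, heating_on, setpoint=20.0, hysteresis=0.5):
    """Bang-bang control with hysteresis: turn on below (setpoint - h),
    turn off above (setpoint + h), otherwise keep the previous state."""
    if temp_c < setpoint - hysteresis:
        return True
    if temp_c > setpoint + hysteresis:
        return False
    return heating_on

state = False
for t in (19.2, 19.8, 20.3, 20.6, 20.1):
    state = thermostat_step(t, state)
    print(f"{t:4.1f} C -> heating {'ON' if state else 'OFF'}")
```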
As an analog design guy, I know very well how hard it is to tune an ASIC for each use case; even with a lot of programmability it will require a lot of designers. Designers that are not available, as the training is hard.
@@aocastro For me it's an ASIC dedicated to voice recognition. The neat part is that it comes with more versatility, as the recognition pattern can be programmed on the fly. Some kind of resonators and a programmable matrix of weighted algorithms of some sort.
I've been intrigued by the potential of analog computing since I played around with classic analog computers in college. That said, I'm not so sure power savings for sensing will be the driving force in adoption of this new version of the technology. Perhaps the most promising opportunity is actually in AI inference engines, which rely on comparison and relative closeness rather than absolutes.
Agreed. The power consumption of a decently built digital chip can be very low, almost immeasurable by some multimeters. See clocks or remotes that already run past the expiration date of a battery. Often it's a battery-type misuse, e.g. installing an alkaline instead of an acid one; alkaline batteries are designed for high-current loads, not continuous loads. I think this is what can be improved if needed, neither analog nor digital.
I mean, we're already making digital circuits using lower-precision formats because the accuracy isn't that important. It's not obvious to me that analog will ever actually be better than a digital circuit of similar "close enough"ness. Anything that's not strictly reproducible makes debugging and validation *really* hard once problems leave toy-level scales.
@@pfeilspitze Is that true for the AI inference engines that the OP mentions, though? At scale, nobody can debug or understand deep neural networks anyway. So long as the noise in the system doesn't multiply and accumulate to create insane outputs, this seems like a reasonable area of exploration.
Analog never went away, don't worry :-). Every tech company has someone who knows their way around analog filters, oscillators and control systems as a bare minimum. The market for discrete components is still a growing one. Digital is simply quite often a cheaper solution, as it takes less time to develop and is highly configurable, but wherever things like cost for mass production, reliability, and longevity play a role, analog never went away. Some industrial PI(D) controllers, low-noise power supplies, RF amplifiers, calibration equipment... In some cases only the interface is digital; the part that does the thinking is fully analog.

I'm absolutely of the opinion that most problems in electrical engineering can be solved with analog systems, and as a hobby I enjoy designing circuitry to tackle these things. But working for a company where electronics design has to pay for the paycheck of me and my colleagues, you need to pick whatever is the cheapest and fastest to produce, and a digital system is usually preferred if it's nothing too critical. Digital systems are very robust as well: you effectively store values as binary data rather than voltages, which is very resistant to electrical noise! With analog systems you're constantly thinking about shielding, ground loops, ground planes, crosstalk due to parallel wires or traces, manufacturing tolerances of components, power consumption, non-linearities, parasitic capacitance and inductance, trace resistance, bandwidths, temperature drift... Digital circuitry just works straight out of the box and is resistant to almost all of these things.

Analog technology is big in footprint too. You're not working with components that switch on or off; you work with components that are optimized for linearity, tolerance, matching, logarithmic behaviour, or temperature stability, so strictly speaking, for active components, the surface area such a component needs is much bigger. The more accurate your computation has to be, the more parts you need to add to your topology to stay within your needed tolerances. We sometimes ovenize parts for stability, or slap a heating element right next to them (a resistor or a carefully "shorted" transistor) simply to fight these things. Digital systems behave the same over a very wide temperature range without losing accuracy, so in many applications digital systems may be much more power efficient.

I think the rise of "analog computing" is mainly a terminology thing. Yes, an analog control system is actively, in real time, "computing" and solving complicated differential equations, but since it just inherently is meant to do that, you don't say it is computing; it is stabilizing the system. You don't refer to it as a computer but simply as a controller. We don't call microcontrollers computers either, anyway. Analog computing is nowadays used as a buzzword. Real analog computers ARE highly configurable instruments, but they are also very impractical and were competed out of the market many decades ago for that reason.

Besides this, education tailors its studies towards the demands of research and companies. Digital systems have way more practical use cases, so when you go to study electrical engineering, most of the courses will be centered around digital systems. The result of this is obviously that most people, by percentage, will specialize in digital electronics.
This is not a bad thing; there are still many people whose first projects were building guitar amplifiers or radios and who just sit through education in order to go back to analog stuff again. Analog and digital signal processing is an age-old, established marriage; it was and still is a big field of research and education, resulting in an insanely large number of DACs and ADCs being pushed to the market to facilitate it. Although I am very passionate about analog electronics design myself, I think it is incredibly important to also understand its limitations and impracticalities. There are a lot of benefits, but also a BIG set of impracticalities you do not put forward. Videos like these leave me very divided. On one hand I love seeing analog stuff represented outside of the audiophile world, in a community that is dominated by microcontrollers; on the other hand I feel like you are wearing prominent eye-flaps and are underestimating the number of analog specialists still walking around today. There are literally conventions for this stuff being held everywhere, and breakthroughs are still being made by companies pushing better and better analog components to the market. Analog will never leave the tech industry. For as long as the world we live in remains analog, the specialization will exist, and the number of people working within it will grow or shrink depending on demand and practicality.
@@noduslabs The difference between the two technologies is very simple, actually. The word 'analog' comes from the fact that electronics were used to build models ANALOGOUS to real-world systems. You can model most physical systems in an electric circuit, where the big benefit is that you can speed up or slow down time as a constant and conveniently adjust parameters. People used these in the past for modeling things like a car suspension, the trajectory of a satellite, the stiffness of a bicycle frame, modeling a car engine before production, etc. There were highly configurable analog computers to program these systems in laboratories, but also non-programmable ones in control systems (the D-Jetronic engine control unit was an analog computer that controlled the timing in old car engines). This tech also extends into other domains like mechanical (the highly configurable differential analyzer, for instance, or the Globus from the Soyuz), fluids (the MONIAC), etc.

The word "digital" comes from your... digits. You can count fixed intervals on your hands. So anything with a stepped nature is at least partially digital. Digital computer systems operate using numbers in fixed increments and time in fixed increments; the higher the resolution, the better the accuracy. The benefit is that it's way easier to differentiate between a fixed set of high and low signals than between one signal that can be high, low and everything in between. It's important to note that digital does not directly mean it works with a binary system. The Soviets successfully made powerful computers using a ternary system, and the very first computers, both mechanical and electrical, mainly used the decimal system. Early computers for somewhat fixed tasks were fully mechanical (pinwheel calculators, the Curta, the analytical machine, etc.), highly configurable ones used electromechanical parts (Zuse, FACOM), and later ones switched to fully electrical and solid-state designs. The one truly Turing-complete programmable mechanical computer I'm aware of that was ever commercially super successful is the Ascota 170 (I own one myself). It runs programs and does calculations using the decimal system too... despite it being digital and me being an analog engineer, it's still the single most impressive piece of technology I've ever gotten to work on. I still can't comprehend how people managed to design such a thing, not to mention how incredibly reliable they were.

Obviously both technologies are used nowadays, and there's a whole subset of parts specifically for doing conversions from analog to digital and from digital to analog... Even digitally controlled analog computers have existed for a while, but they were so horribly expensive they didn't stick around for very long.

As for books to recommend... it is really hard to help you out with titles, since most of the material I learned from was in Dutch and German textbooks. The rest of the knowledge came from simply repairing electronics, seeing how other engineers solve problems, and doing a LOT of design work (preferably without simulations, but by actually building and measuring)... The internet is an alright place to start, but most electronics-related content revolves around microcontrollers nowadays. Especially for analog systems, the golden years were from the early 1960s to the early 1980s or so (unless you master an Eastern European language; they continued developing analog stuff for an extra decade). Few advances were made discovering new components afterwards, and the world shifted digital.
Anything before that relies on vacuum tubes and germanium semiconductors. Despite being very interesting and in rare cases actually useful, it won't serve you at the beginning of your adventure :-). For a general direction, and to admire computing history, I added a lot of search terms between brackets you can google. Obviously these are the famous systems from literature, covered somewhat extensively on the internet. I recommend you go to tech and computing museums to find far more interesting and obscure technologies. Many people came up with wild and creative solutions to solve problems back in the day... from modeling a country's economy using water, to making logic gates for high-radiation environments that use high-pressure air channels, to predicting the tides with belts and pulleys...
Wow, you really offer a brilliant and concise analysis of analogue versus digital, and how well-developed the field actually is. Thank you for taking the time to do so! There were some other comments about how this was more marketing and investment-hunting than informational, but now I understand *why* it is!
I do believe analog is the right choice for things like machine learning, since it has a probabilistic foundation. But for practically any other case digital is far better; the rise of digital was mostly due to its help in discrete computing of hard numbers in banking and offices, that is never going away.
I won't ever buy a phone that's fully digital, nor a car. For me analog is a sensation, like opening a door or pressing a button. Without that feedback, digital is basically just wind.
@@lm4349 Well, in electronics, digital and analog refer to very different things than what you just mentioned. Plus there is no way that a car could ever be fully digital, because all the sensors are sending out some voltage. This voltage is analog and needs to be converted to a digital signal before being interpreted by the on-board computer. Even the input of a button needs to be converted. Voltage is not really something you can feel (unless it's large enough, but don't do that). What you can feel is the interface between you and the sensor.
A drawback to analog computers is that the elements have to be hand-wired for each type of calculation. Some form of digital switching would still be required to automatically set up an analog processor for each task.
The problem with analog is that you cannot daisy-chain processes like in digital. In digital there is no loss of data, because a 1 is always a 1 and a 0 is always a 0. In analog, a wave is never exact; it's a close approximation at best, even if you correct for it like they're stating here. That's because even a single analog value holds a (literally) infinite amount of detail; you simply cannot fully compensate for it. This means that, as you send the wave from device to device, the wave distorts more and more (pretty much exactly like how video degraded when copying from VHS tape to VHS tape in the 90s). Mind you, this doesn't make the tech inherently bad. It's definitely very useful. But it's really not as simple to use correctly as digital is. Pros and cons and all that. There are very good reasons digital replaced analog, and saying things like "rebuilding the world from scratch" is being markedly naive.
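A quick Monte Carlo sketch of that VHS-copy effect, with a made-up per-stage noise figure, just to show how the error grows as stages are chained:

```python
import random

def through_chain(value, stages, noise_sigma=0.01):
    """Pass a value through several analog stages, each adding a little Gaussian error."""
    for _ in range(stages):
        value += random.gauss(0.0, noise_sigma)
    return value

random.seed(0)
for n in (1, 10, 100):
    errors = [abs(through_chain(1.0, n) - 1.0) for _ in range(10_000)]
    print(f"{n:3d} stages -> worst observed error ~ {max(errors):.3f}")
```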
Not that I don't believe that analog could be more power efficient, but it would have helped if you did some back of the napkin calculations to illustrate by how much.
I watched the whole thing and heard a bunch of "this technology". What is it, though. How does it work? Is there a paper I can read that tells me how they are doing this? The power consumed by an Analog to Digital converter isn't significant enough to claim this much power savings. I had to search for details on their AML100 chip to figure out what they are talking about. Maybe adding a bit of technical detail or providing links for those interested would be a good idea in the future. Without that info it is difficult to decide whether this is pathbreaking technology or just another Turbo Encabulator.
It would be fucking silly to go back to analog. I'm a huge analog synth nerd, and let me tell you: the reason we love them so much compared to digital processing is because they are fucking silly. They decay with time. They need to be repaired and taken care of forever. You have to let them cool down or heat up when you carry them around, because of the changes in temperature, just like an acoustic instrument. Would you like it if you had to retune your cell phone like a guitar every time you wanted to send a file?

This truly just feels like that con artist of a health company from a few years ago that promised bibbidi-bobbidi percentages of better blood analysis that would revolutionize everything. The real reason digital replaced analog (and that seems strangely lacking from buddy's explanation) is because it is reliable, no matter the state of the system. You could send a Game Boy into a bombardment in Afghanistan, have it eat a full-on third-degree burn, and that boy would still play your Super Mario just fine. Even when a digital system is offset, it works great, because it's all ones and zeros, i.e. a beep is sent or nothing happens. It can be referenced in time by the system, and if some signal that was sent falls in between 0 and 1, the system is still able to sort it into the one pile or the nothing pile. Throw that out the window with analog; you would have to take extra care of everything. Your friend would send you a Word file, and it would literally crumble over time just like a sheet of paper. Your favorite pictures would fade over time (an infinite array of pigments activated by a light source? that's a Polaroid for you). No AI could help you with that either; it would either end up using the same universal language that digital does, while giving you the illusion that it was all yours, or just skip that step entirely and give you something different or degraded. Great.
He said that their software fine-tunes the analog signal so that it's reliable digital bits & bytes. So, it's not analog computing. It's a better analog to digital conversion for digital computing.
It's analog computing, as it performs transformative operations on analog data and can be programmed with a pipeline that performs those operations in a specific order and manner. It's not a simple analog-to-digital conversion, which is fixed circuitry and can't be programmed at the analog stage of the signal.
@@anykeyh According to his exact words, they fine-tune the analog signal. For what purpose? For that matter, why even do this if you're computing based on the analog signal? The answer is right there. What they're doing by calling it analog computing is muddying the waters. It's like calling LLMs artificial intelligence: they're not even remotely intelligent; they're really complex filtering and composition tools.
1. I've been watching these developments for a while with rapt attention 2. I think the biggest barrier to entry for these devices is purely going to be economy of scale. Once we can get the full production and shipping cost below energy consumption savings for each application, these take off. Fast. And I can easily see a way for this to be the best way forward in so many places. Right now they're talking sensors, but they can also be incredibly useful for mathematical logic. What I feel like they're not talking about is how this is going to be a game changer for AI in the long run as well where thresholds change decisions. I feel like this is one of the final pieces in a particularly dangerous puzzle, though.
I feel like I'm missing something. I don't understand how energy savings could ever offset the cost of a device like this. Let's take an "expensive" inefficient industrial communication technology, 4-20 mA, as an example of something this technology could reasonably replace. A 4-20 mA device consumes about 2 kWh or less per YEAR. Yes, 4-20 mA is an analog technology, but most modern devices include microprocessors and can do simultaneous digital communication within a power budget of 4 mA, so it doesn't even cost any extra to add digital functionality to this kind of device. This new technology would stand to save maybe 50 cents a year in energy costs for this common application. I don't see how that would ever offset costs, and that's ignoring any questions about reliability or signal integrity, etc. I think this is cool and I'd love to see it gain traction, I just really doubt the whole energy savings argument.
@@Tom-xs7gp Running ChatGPT costs around $0.03 per query. Precision of the numbers is not really important for AI, so they are already quantizing models down to int4 or even less (that's 4 bits, i.e. 16 values). If you could do these calculations in analog it would be even more efficient, so I think that's where there is the most to gain. And analog computing won't change everything; it will just be used in some places (maybe even on a digital CPU) to handle some stuff faster. What will be funny is the same AI models giving different answers because they are analog and some of them take a different path.
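For anyone curious what "int4" means in practice, here is a rough sketch of symmetric 4-bit weight quantization (the scaling scheme is simplified compared to what real toolchains do):

```python
import numpy as np

def quantize_int4(weights):
    """Symmetric per-tensor quantization to 4-bit integer codes in [-8, 7]."""
    scale = np.max(np.abs(weights)) / 7.0
    codes = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return codes, scale

w = np.array([0.42, -0.13, 0.91, -0.77, 0.05])
codes, scale = quantize_int4(w)
print(codes)          # the 4-bit integer codes
print(codes * scale)  # dequantized values: close to, but not exactly, the originals
```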
@@kuro19382 I don't know about OpenAI, but there are several quantized and scaled-down LLaMA LLMs you can test on your own computer. Search for WebLLM for one that runs some models in the browser on your GPU.
Let me think from the industry's point of view here: if they can produce the same result at a lower cost to themselves, it will be done. If the cost is the consumer's responsibility, as is the case with energy prices, the industry doesn't care. Semantics also matters here: after the millions spent on marketing promoting "digital" as modern, superior, and advanced, relative to "analog," which today is seen as old, backward, and inefficient, analog will simply change its name, just as UFO became UAP.
Digital has always been much easier to wrap your head around, because it's just on or off; there's no gradient or continuum where the values can fall anywhere between two extremes. That's why the technology of the 70s, 80s, and early 90s was such a marvel to me. The insane wizardry that was required to make those technologies work back then was just incredible. They were technical feats that required real out-of-the-box thinking. With digital it all just becomes a matter of quantity, such as adding more transistors into an increasingly smaller space. But now that we're reaching the limits of our current materials, we're no longer getting the returns that we'd like. So going back to analog, while using the current technologies and manufacturing processes that we've developed for digital electronics, will let us eke more out of the current materials we use to make chips without having to turn to more exotic materials.
"UFO now it's UAP" isn't the best example could've gone with. You wouldn't call the Aurora Borealis a "Flying Object", right? Signal flares, lasers (either originating from the ground, or from satellites), rainbows....none of them involve aircraft. So if the current term doesn't fit them, you need a more encompassing term.
I don't think that's about the bill; energy consumption changes how things are designed. If you were to implement a monitoring system in your home with hundreds of sensors, it's useless if all of them need a battery change every other week - such a design is never going to be implemented.
That's an interesting insight. In music production, "digital" typically means harsh, grating, and unmusical, while "analog" is synonymous with warm, lush, and full of natural harmonics. If you peruse Sweetwater and other vendors of musical products, just about every digital device boasts that it sounds more authentically analog than any other digital device ever made. In the same way, nearly all high-end guitar amplifiers still use vacuum tubes. I've tried several of the latest "emulated analog" digital amplifiers, and not one has come close to the tone and squishy responsiveness of my old tube amplifiers.
"you don't have to convert to digital" HOW "this way it is way more energy efficient" in general analog is NOT efficient, WHY should this be. then they switch to another clown and the new clown repeats those same sentences with a fee words changed, and the cycle repeats.
One big problem: natural noise affects results, which means there could be unwanted errors and bad precision. I've also been building a DIY analog setup for about 3 weeks, and its precision is about 1 picoamp and 1 picovolt, which sadly is too low for me to easily work with. But this is still powerful precision. Let's hope this company does something better.
Other than photonic computing (which also has a lot of challenges), I don't see analog computing being used for things where we need precision (there are actual uses, like low-precision AI inference).
Of course, analogue computing isn't meant to replace digital. It has some specific applications, like training neural networks, which can still work fine with a significant margin of error.
How is that different from, let's say, digital signals like WiFi? "This device complies with part 15 of the FCC Rules. Operation is subject to the following two conditions: (1) this device may not cause harmful interference, and (2) this device must accept any interference received, including interference that may cause undesired operation." You must always check your inputs :P
@@iWhacko Come on, you know what I meant by natural noise. And you know about the effects of space radiation, etc. And you know that analog computers can calculate all digits at once instead of 1 at a time.
@@LiberalEntrepreneur Yeah, I hope taking a bunch of averages solves this, because advanced AI is only affected slightly - something like 0.1 in terms of error cost (error-rate cost).
And isn't that already in use in phones? Like "Hey Siri": a passive sensor listens for the "S" frequency band and then wakes the CPU up from its low-energy state?
It’s not a sensor. Sensors provide analog data, this takes in analog data and performs calculations therefore it is a computer. It is huge, because now we don’t need to convert the provided sensor analog data into digital data in order to compute it. It rejuvenates Moore’s law in a sense.
Great topic. I had a flashback to college, learning A-to-D converters 🤣 What was old will become new again, and just because something works well does not mean it can't be improved. Cheers
In my curriculum (Bachelor in Industrial Information Systems), I was also required to interface sensors with automata. We also had "signal processing," because you can do miracles with the proper circuitry. Instead of trying to filter unwanted data in code, you can block the signal with an RLC network. Or, instead of trying to determine whether the signal has the same shape, you can have a delay line with a comparator. Analog is still alive and well! Though I forgot most of it, thanks to programming 😂
Early in the video you talked about the demand for energy for computing outstripping the rate at which energy is produced (which I guess means something like the rate at which energy production is growing). But analog computing would only have an impact on that problem if the use cases for the analog computers are the same use cases driving up energy consumption. Later in the video you discuss a number of examples of analog computing being used for sensors, and that discussion morphs very quickly into analog computers enabling NEW kinds of sensors. If analog computers, as power-efficient as they are, result in people's houses being full of novel kinds of sensors, all you've done is increase the amount of energy spent on computing AGAIN: you've created a tech niche for new sensing products. Analog computing only helps overall energy consumption if it replaces existing high-energy computing tasks, e.g. LLM artificial intelligence, blockchain computation, etc.
I get the feeling this was a shill for the company making these. A lot of words were said to describe very little. Not a lot of explanation/background was given for a layman to understand its significance.
There is something important that is only vaguely explained here. Digital is a language: it needs to be encoded to transmit and decoded to be processed. This is why digital, the 'language computer', is easily configured for any purpose - it defines its task by reading instructions, and this is where much of the energy is consumed. Analog does not use language; instead it is a scaled model of the world itself. Like one picture being worth a thousand words, computation is much more efficient because that's nature working itself out, not an abstraction. I disagree with the argument that analog always saves power. In terms of procedure, yes: an analog system does not have to run at gigahertz, constantly shuttling data between registers. But being a scaled representation of reality, you cannot rule out that some task has to depend on current, the most power-consuming part of electrical physics. Another important aspect of analog systems is that the signal needs to be loud to resist noise and interference, which means it uses more power to boost its amplitude. The remote control is also not a good example of an analog application; the exact technicality is related to the watchdog timer and has nothing to do with the functionality of the device. The example can be misleading, because a remote control is a user interface, and UI is an unnatural entity that is very difficult to do in analog.
Sadly, this is all fluff and no stuff, with no real-life use-case comparisons of power and performance. It seems like nothing but a repackaged CPU that works with some custom analog sensors. It's more like fusion research, which never ends but keeps getting closer to reality every other day. With advances in energy-harvesting technology, digital sensors are already close to an ultra-low-energy reality. Direct analog sensing and processing (digital eventually) is more like reinventing a better wheel, IMHO. ♥️👍
Nice, a 12-minute video with amazing graphics and audio quality about achievements in analog, with zero explanation of what has actually changed... So... "WTF is analog computing"? Love the production, guys! Let's work on the reporting.
I've previously seen the idea of the opposite of an analogue computer waking up digital components. The idea was that it would be a standard digital computer, and it would have an "analogue card" that it would use for tasks that lend themselves to analogue computing, such as facial recognition.
As an electronics engineer: this is already done on embedded micros. You put the micro to sleep, leave the analogue-to-digital converter in a low-power conversion mode, and it will wake the CPU up on certain conditions (events). This whole video is just for layman investors.
@@ntal5859 This pitch is total BS, but running an ADC in deep sleep is certainly not "analog computing" by any stretch. One common thing we find in any 10-cent MCU is an op-amp used as a comparator to generate wakeup events, which is very different from continuous ADC acquisition.
I grew up with analog PID controllers in industrial and power plants. When I got my engineering degree I specialized in control technology, specifically analog controllers and relay logic (my professors were stunned at how fast I could design a relay system to control things, as I had been figuring out and troubleshooting them for years in the Navy and in commercial power plants by the time I took those engineering classes). The problem with relays is the power they use and how bulky they are... but for some applications they are still the best, as they are cheap if only a few relays are needed. I always thought that digital controllers for most industrial plant processes were overkill - and they sure are more time-consuming and costly to fix when things go wrong. With analog you just replace the controller card with the right "dip switch" settings and adjust a variable resistor or variable capacitor to get it right (normally takes a few minutes). Then you repair the board by finding the component that has failed. I'm glad to see analog making a comeback. It sure uses a lot less power than digital computers.
I still don't get the computing part. What I get is that they are only talking about analog input hardware. Where is the "computing" part done in an analog way? For example: how do you sum two numbers? Join both input voltages in series?
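To that question: the classic answer is a summing junction, where Kirchhoff's current law at an op-amp's virtual ground adds the input currents. A sketch of the ideal-op-amp algebra (component values are arbitrary):

```python
def inverting_summer(v_inputs, r_inputs, r_feedback):
    """Ideal inverting summing amplifier: Vout = -Rf * sum(Vi / Ri).
    With equal resistors this is just the negated sum of the inputs."""
    return -r_feedback * sum(v / r for v, r in zip(v_inputs, r_inputs))

# Summing 2 V and 3 V with all resistors at 10 kOhm gives -(2 + 3) = -5 V.
print(inverting_summer([2.0, 3.0], [10e3, 10e3], 10e3))
```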
Same thing with linear versus (vs.) non-linear systems. In nature, the natural form of things/processes is inherently non-linear. Academia and others conveniently linearize systems to fit convenient calculations and mathematical frameworks.
Am I the only one that heard a whole bunch of nothing here? The video makes the claim that "a bunch of energy" is spent converting signals from analog to digital. While I'm almost entirely certain that that's false (heck, even low-power PICs and Atmels have built-in ADCs), they don't spend any time even attempting to defend that claim. They throw around the term "analog computing" but don't get into any specifics. Does their tech still use the Von Neumann architecture, or is it something else? How does it handle concepts like memory and data retrieval? Their demo showed (what I assume is) a neural network being run via their analog tech. That's great, but neural networks are the exact opposite of "easy to program by writing software", which was what they claimed their new technology had solved in the first place. It honestly just seems that they've created nothing more than... analog sensors that filter/detect things? Where's the analog computing and the software programmability the video began with? I'm honestly not sure what the hype is. Maybe we get a few more low-power passive sensors if their mysterious tech works, but that's about it?
We have plenty of microcontrollers that have analog input and outputs. Since we're already handling this kind of input and output, I don't understand what we gain by including this directly into a typical APU, or HPC APU. Seems like buzz about nothing.
The interview did not explain clearly how this "analog computing" works. I looked up the details on Aspinity, and it seems to me the chip functions as a low-power signal threshold detector. Normal digital processing requires the sensor output signals to be digitized before comparison, which needs computational power for the digitization process. With analog processing, the sensor signals can be split, transformed, and/or combined with a reference signal for threshold detection, which requires zero digitization power. The interesting part to me is that the chip must be able to generate a reference analog signal out of the embedded ML model for analog computation. I wonder what the magic sauce is?
There are lots of types of analog computing. For example, fluidic computers use tubes, valves and reservoirs to direct fluids around a circuit, and depending upon where the fluid winds up, you get different results. It's particularly useful for modeling the economy, as you can set the variables and see where the money winds up pooling.
@@nicolasdujarrier The channel is Veritasium, the videos' titles are; "Future Computers Will Be Radically Different (Analog Computing)", and "The Most Powerful Computers You've Never Heard Of".
A 'natural analog computer' is actually neither 'analog' nor a 'computer'. It is not 'digital'; that's right. But it's definitely 'atomic'. And it cannot 'count'. The problem here is: why do we need such a non-computer?
Wake word detection and very basic neural networks are a perfect application for this as those don't really need all that high precision, yet require thousands of parallel calculations for a quicker response.
I do feel like in the rush to create a digital future, a lot of the strengths of mechanical systems or analog systems were forgotten. There are a lot of things out there that just really do not need to be digital... a good case in point is a vending machine or a soda fountain at a fast food place. In the past, we had these simple, mechanical vending machines where you put in money or scanned a card, and then pressed a button corresponding to the drink you wanted, or you held down a button to get what you wanted from the fountain, with maybe an LED indicating when a drink had run out with a simple analog sensor. These were replaced with devices that have incredibly laggy touchscreens and menus that you have to hunt through, and all this trouble of greying out disabled items if the drink has run out. It seems laggier and more energy intensive than the old system. The problem with digital and IoT seems to be, in a nutshell, that because we CAN put a digital computer in everything, that means we SHOULD put a digital computer in everything... and in all honesty, there were plenty of mechanical or analog systems that worked just as well or better than the digital systems that replaced them. There are things digital does better, sure, but do we really need a digital computer in everything? Does every device need a touchscreen instead of physical buttons just because a smartphone has one and it seems modern?
I think you're complaining about the wrong thing. Digital is *absolutely* better than trying to make a collection of cogs and levers and such that dispenses the thing. You can have buttons as inputs to digital things too -- see keyboards -- it's just a matter of the design people make for the vending machine. "The touchscreen is crappy" isn't a problem with digital; it's a problem with deciding to use a touch screen for no reason.
@@pfeilspitze By the way, you can build digital exactly with cogs and levers :) Digital does not mean "modern", and the quantum computers are more analog than they are digital.
@@getsideways7257 Sure, or with siphons, or any number of ways. And we don't, because doing it with transistors really is better. Mechanical digital calculators exist. They're incredibly finickity, limited, and expensive compared to grade-school-level throwaway calculators using microchips.
@@pfeilspitze Well, it's just a matter of perspective. If the civilization crumbles and we are left with crude tools only (or not so crude), it's not exactly easy to get back to the stage where grade school level calculators on microchips are "throwaways"...
Noise is present in all types of chips, unfortunately, but with analog chips the speed is 2-3 orders of magnitude faster. Also, memory is combined within the chip, so the footprint is smaller. It's busless computation, so you don't have to read and write from a separate location. For AI tasks it could be very energy efficient.
There is a potential security issue at 8:32, where the analog processor could not distinguish a recording of glass breaking from an actual glass-breaking event. It may generate false alarms, and that is gonna be costly. It may be improved to work around this.
A typical shit-talker here. Every sound detection can be spoofed by playing the sound on a loudspeaker - every single one. You may have some fidgety heuristics that you can use to detect "fake" glass-breakage sounds, but they can be gamed as well and you will have more false negatives. The sound waves that are generated when you break glass are nearly identical to the ones that are generated when you play that sound on a loudspeaker. Also, what is the use case? I want to trick someone's alarm system by playing a loud sound of glass breaking? How loud does it have to be to go through the glass that I want to break?
I'm halfway into the video... and there is nothing substantial. No hard examples of what makes this fundamentally better than A-to-D conversion. How long has quantum computing been sold now, 24 years? And yet there still isn't even one real application. I think this is the same - too much hype.
I understand this type of video is supposed to dumb down the topic and use buzzwords. However, at the end of the day this is a repackaged DSP chip of the kind we have already been using since the 1980s.
Not convinced: for the past decade most of the analog circuitry in the vast majority of consumer gadgets has been replaced by eight pin microcontrollers.
I agree, and this video is not very transparent about the details. Additionally, the nature of analogue components make them much more vulnerable to external interference like temperature changes and motion. Without a digital control step between sensors, noise and errors propagate naively through all components in the chain of calculation. This means you will likely go through debugging hell during development and maintenance hell once something is out in production. Analogue computations are suited for equations with a high number of dynamic parameters whose behavior and interaction can be replicated using analogue counterparts. A historical example of this is the prediction of tides with the position of celestial bodies as input parameters. Another is to aim free-falling bombs from a plane with inputs like the airplane's speed, altitude and the rotation of the earth. This begs the question whether what this video attempts to sell is aimed towards DIY consumers or enterprise level investors.
This is *not* analog computing; if it is, then it wasn't properly explained. Either way, true analog computing is Vaporware and will remain so. I'm old enough to have witnessed so many "disruptive inventions" that never materialized. And yes, prior to being a software developer, I had a degree in electronics. Analog at that.
Analog computing was quite a big thing in the 60s and early 70s - both analog and digital computers were manufactured and had their fields of use where one or the other was better. At the time it was common to think that it would remain so in the future as well, i.e. both types of computers would be developed in parallel. But even in those early days of computing it was well understood that analog computing works well for some specialized applications (where analog computers were blazingly fast, totally outclassing their digital counterparts) - mostly signal processing of any kind, numerical solving of differential equations (e.g. for various mechanical or other physical simulations), etc. - but that it is not possible to have a fully analog universal, all-purpose computer. In other words, an analog computer does not fulfill the definition of a Turing machine, while a digital computer does.

However, the main downside of analog computers was that they needed to be programmed IN HARDWARE, i.e. changing the program actually meant changing the layout of wires on the main switchboard and the settings of knobs and switches that controlled the operating parameters of specific modules (you may see these modules as equivalents of "functions" or "procedures" in digital software). Once digital computers became powerful enough, they practically obliterated analog computers because of ease of programming - as it is also said in this video, digital computers can be programmed in software, which was not possible for analog ones. And so we started to "digitize" everything, which is probably not the best way.

A return to analog processing in *some* aspects - those where that kind of processing works well - with the specialized analog part cooperating with a general-purpose digital part (what was once called a "hybrid computer") may be a desirable computer architecture of the future, but of course it needs to be done properly.
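To make the differential-equation point concrete, here is a digital simulation of what a classic two-integrator analog patch for the oscillator x'' = -x computes; on a real analog computer the two integrators are op-amp/capacitor blocks wired in a loop and time is continuous rather than stepped:

```python
# Two-integrator loop for x'' = -x (a mass-spring oscillator), stepped numerically.
dt = 0.001
x, v = 1.0, 0.0                        # initial condition: displaced, at rest
for _ in range(int(6.2832 / dt)):      # roughly one period (2*pi seconds)
    a = -x                             # the summer/inverter forms the acceleration
    v += a * dt                        # first integrator: velocity
    x += v * dt                        # second integrator: position
print(round(x, 3), round(v, 3))        # ends up back near (1, 0) after one period
```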
Great stuff. As an industrial electronics designer, I have been wondering when this would happen. Some years ago, when speech conversion was in its infancy and digital devices were cumbersome at it, I designed an analogue converter that would create the initial phonetics, which would then be passed to a micro to deal with the logic. I never built a prototype, but it would have used a HUGELY smaller amount of processing power. My industrial designs are all a combination of digital and analogue. I have been involved in analogue electronics all my life, so I'm very experienced in that; when I started programming micros some years ago, a new world opened up. I see designers out there making software models of analogue, with big programs in DSPs, when I would do it with op-amps, logic chips and passive components. Much easier. Today, I design with whatever best suits the application. I end up with some interesting combinations of analogue and digital electronics.
I'm not sure how people aren't understanding what is being presented here. The comments are full of people saying "they didn't explain anything" and saying it's useless jargon lol. I get the benefits and understand this right away. It's essentially a way to have processes happen locally without everything having to be digital all the way through. When you add up the potential energy cost savings across the country or even the world, it's enormous. Lots of different devices could probably use analogue computing to complete a process with the help of a digital system somewhere along the line. Maybe credit card machines for example? Or gas pumps? Anything that does some kind of easy and repetitive task and uses a digital set up to do so. I record music and have accumulated a decent amount of knowledge with regards to translating analogue signals to digital ones. I would be curious to see how this could be used to decrease the latency of plugging analogue instruments into an audio interface to turn the analog signal into a digital one.
@@jedimindtrix2142 The people who don't understand it are the people who understand computing and electronics. Yeah, yeah, cost-efficient and stuff, but how does it work? There is no explanation whatsoever about how it actually works at the electronics level. It seems there is no concept, just marketing.
You are the exact sort of gullible fool this video is aimed towards. He 100% did not explain the technology at all. Half of your comment makes no sense. "It's essentially a way to have processes happen locally without everything having to be digital all the way through" - this sentence makes zero sense, as a process being local neither prevents it from being digital nor saves any power. How on earth would a credit card machine or gas pump benefit from not being digital when all the data they deal with is digital anyway? Again, that makes absolutely no sense at all. The more repetitive and basic the task, the easier it is to do digitally. Analogue computing is only used when you need a waveform for perfect accuracy (such as with music); any other time digital is both more efficient and cheaper. This is also not a new technology like you imply; it is older than digital and came first. Analogue computers were made obsolete by digital in 99% of cases. @@jedimindtrix2142
@robinconnelly6079, There was a product called Girl Tech Password Journal in the 1990's that used this very technique. The password was a word or combination of words which would be converted to phonemes and stored. The device, when prompted, would listen for audio matching the phoneme wave patterns and unlock when the match was close. It wasn't a perfect system as the matching had a lot of error in it. I believe there was a switch to set the matching accuracy, basically some percentage of waveform mismatch allowed. It was a very clever device for its time.
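A toy sketch of that "percentage of waveform mismatch allowed" idea (this is only an illustration of the concept, not how the actual product worked internally):

```python
import numpy as np

def matches(template, candidate, max_mismatch=0.15):
    """Accept a candidate waveform if its mean absolute deviation from the stored
    template, relative to the template's own average level, is small enough."""
    template = np.asarray(template, dtype=float)
    candidate = np.asarray(candidate, dtype=float)
    mismatch = np.mean(np.abs(candidate - template)) / (np.mean(np.abs(template)) + 1e-12)
    return mismatch <= max_mismatch

stored = np.sin(np.linspace(0, 2 * np.pi, 100))                    # stored "password" pattern
spoken = stored + np.random.default_rng(1).normal(0, 0.05, 100)    # same word, a bit noisy
other = np.cos(np.linspace(0, 2 * np.pi, 100))                     # a different sound
print(matches(stored, spoken), matches(stored, other))             # True False
```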
The world already melds the analog and digital domains where it makes the most sense. Sensing glass breaking is by no means a groundbreaking function and is easily implemented extremely cheaply in software. Making the front end of this device perform a couple of extra "analog" computing steps before being digitized and processed anyway seems unnecessary. Sorry, "analog computing" will NOT supplant the digital domain in any serious way, not just by 2040, but ever. These units will make a nice peripheral piece to outfit a Raspberry Pi or Arduino experimental or R&D lab, and no doubt some real applications could be practical. However, these guys were big on hype and VERY shallow on exactly HOW and WHY it would be an improvement over the status quo. The angle toward power savings is a stretch and immediately makes my BS-o-meter activate. My TV remote already lasts 2-3 years on 2 AA batteries.
This looks like just IoT stuff, and IoT did not really take off like everyone expected. By now we should have had IoT sensors in everything in our homes to monitor the whole environment and check for failures. That never happened.
About this video: I thought it was about some analog computing technology, but it's more a discussion of what could be if that technology existed. Kind of disappointed...
The first time I saw an analog computer it was an analog calculator. You changed where wires plugged into inputs, and it was able to accurately calculate and display certain graphs and such (I don't remember specifically what they were). From the way they are going, it appears they are trying to pair analog with digital, which seems like an amazing idea. I look forward to seeing how well it optimizes power consumption in the future :D
Imagine creating something like the first ARM processor, which used so little energy that static in the air was enough to make it run, but with this technology 🤯
Sooooo, we're still digitally processing analog signals, but with fewer layers of chips and lower power consumption (fewer layers between a sensor and a logical output)? Seems like a tweak on the existing model, not an entirely different architecture. While not an engineer, I work with analog sensors in industrial controls (I am exactly the guy who needs to know when a bearing is going out before it actually does!), and this seems to fit under the IoT umbrella. I love the developments they are describing, but what is the "revolution"? Integration of traditional components into a single chip? Or is it the power management relative to maintaining signal processing?
(Are they talking about condensing microcontrollers with analog sensors over some common bus, but with the bus integrated into the microcontroller's processor? "Chip set" or "chip"? Are we talking tiny PLCs in a common architecture? Or are we talking about a platform for rapidly integrating analog sensors into custom hardware? All good, but I don't know what they're differentiating.)
Wow, you managed to make an entire video about it and explain so little. How is it programmed? Can it do logical operations, can it do sequential computations, can it store memory? All I take away is that it matches some patterns efficiently, and that's it.
But isn't the program used to interpret these analog signals digital too? And in the first place, digital conversion of analog signals is necessary because computer programs talk and listen using digital signals.
I am not convinced that this is taking over the world, but sure sounds like it would make that guy's day, eh!? The explanation doesn't even make sense. Analog processing AND software programmable. At some point, software is going to need a 16 bit integer (or whatever) to work with to make decisions - that's A2D conversion. So where's the big win? Color me skeptical... next!
"Audio devices might become mostly analog" - you mean, like they all used to be until just a few decades ago? A walkman did not get more time out of a battery than an MP3 player without a display, btw. Also, good luck building a mostly analog OP1 :D
I've been dreaming about something like this for years. I remember a world before everything went digital, when most devices were electronic (analog). Much more responsive, much more stable devices. We did have digital, we did have computers, but even those, as I remember, were more responsive.
It's an on-board Fast Fourier Transform on a single chipset, working independently of a central processor and tuned for a specific wave or multiple waves. I don't see the hoopla.
FFT is one of the application fields that is specifically very well suited for analog computing. Analog computers were always better at this than digital ones.
Engineer here, there's a reason why analog computing stopped being a thing in the 50s. Problems with residual voltage offsets, mains/AC interference, noise, harmonic distortion, hysteresis, and slew rate only get worse as you try to condense analog circuitry (or run at higher frequencies). The imprecisions and limitations of digital calculations are fixed: a single formula can give you the exact amount of error to expect from roundoff or truncation, and you will get the exact same percentage every single time. If there's too much error, you can just use a larger and more accurate storage format (double-precision floating point, for example). Analog circuitry requires complex nonlinear models to accurately predict/replicate error, and you can't just allocate extra memory to fix it. If this guy somehow managed to invent a super-material that allows large-scale integration of analog components without these issues, I'd love to hear more about it.
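To make the "fixed error" point concrete, here is a minimal sketch using NumPy; the machine-epsilon bound shown is a standard property of floating point, not anything specific to the chip in the video:

```python
import numpy as np

# The relative roundoff error of a stored value is bounded by the format's
# machine epsilon: the same bound on every chip, on every run.
for dtype in (np.float32, np.float64):
    eps = np.finfo(dtype).eps
    x = dtype(1.0) / dtype(3.0)           # 1/3 cannot be stored exactly
    rel_err = abs(float(x) * 3.0 - 1.0)   # how far the round trip lands from 1.0
    print(f"{dtype.__name__}: eps = {eps:.2e}, observed error = {rel_err:.2e}")
```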
Shill video. Analog computing will likely only make a comeback for applications that don't need accurate outputs. The dude's also wrong about iPhones "transmitting in analog": a carrier wave does not an analog signal make. It's the "digital is analog with extra steps" argument. The less power-hungry, more sensitive, and smaller analog computers get, the far less accurate they are. Anything that needs reliability needs to be digital, or digitized at the source. Analog sensors waking digital ones has been used as a power-efficiency stopgap before. They work best when the sensor is powered by the signal it's sensing, and in that sense, we're already doing a lot of that. As for audio, you can take my DAC from my cold dead hands. I will always process audio in digital and convert to analog right before the speaker. Quality loss is a BITCH, and that's why even for "analog things", digital is still king.
Why would it save power? Raw analog signals don't have anywhere near the energy to power any half-way complex circuitry without amplification. And that amplification needs a power source, so there's no such thing as a "constantly off" analog sensor system, at least not one that's somehow superior to digital alternatives, where you can also gate the power source with a transistor and a latch on the sensor line....
🎯 Key Takeaways for quick navigation:
00:00 🌐 Analog processors, unlike digital ones, operate based on waves, offering a potential paradigm shift in computing.
00:57 ⚡️ Analog processors promise AI and machine learning capabilities with significantly lower energy consumption, about 1/1000th of their digital counterparts.
03:54 🔄 Software-programmable analog processors can interpret raw analog signals without the need for conversion into digital, a transformative breakthrough in computing.
05:53 📈 The integration of analog computing in devices is expected to reach 30 billion by 2040, potentially revolutionizing how computing energy is consumed and deployed.
06:48 🔋 A combination of analog and digital processing could optimize energy usage, allowing for continuous monitoring with minimal power consumption and extended battery life.
09:19 💡 Analog systems enable more efficient monitoring and insights, particularly in industrial settings, by utilizing minimal power for extended periods and improving resource allocation.
10:47 💓 Analog computing excels in applications like heart monitoring, utilizing power-efficient sensors to detect anomalies and trigger appropriate actions, showcasing its potential in healthcare.
Made with HARPA AI
Analog computing was used a lot in the military. A great example is automated target tracking for tanks on the move: the idea was to always keep the gun on target while riding over uneven terrain. I saw an example on Yugoslavian T 84 tanks; it had ultrafast, efficient analog computing, much faster than the digital of that time. I believe the system was developed in the 70s and fielded in '84.
Sadly this video contains so many incorrect statements that it just hurts. For example, the energy required for converting analog input data to digital is almost non-existent. Take your average game controller: it is connected via the digital USB connector, yet every modern game pad has an "analog stick" which reports its position as voltage (two values, one for the x and one for the y axis). So does this require "computing" or drain your controller? No, of course not. Almost all analog-to-digital conversion is done by fixed circuitry working on purely electrical principles which have nothing to do with computing at all; it is essentially a "voltage translator" that encodes the analog signal digitally. There is no processing involved.
The video also claims most analog sensor data (using the smartphone as an example) is converted from analog to digital and then back for presentation to the user. This is also wrong, because almost all sensor data is never converted back; instead it is used as input for some graphical user interface. You have to make the data digital at some point, because that is the only language LCDs and other screens can understand. Why? Because a pixel is either on or off, and the light value has a discrete fixed range. There are no "analog screens". So you will always display your sensor data in a digital representation using an application on a given screen.
Another wrong claim is that using this technology (analog computers) we could finally reduce the energy consumption of specific tasks (e.g. voice command recognition) by doing the processing in analog. Sorry to break it to you, but electrical engineers are a very smart bunch; while we software developers mostly have no idea what they are talking about, they do understand our needs, and that is why this kind of wake-word detection is ALREADY analog, using specific on-board circuitry. Using an analog computer instead of an analog circuit will in fact INCREASE the energy required for this task. At best an analog computer could be used like an FPGA (a programmable circuit), but an FPGA will still require less energy; it just has to be "programmed" first.
It's too frustrating to even go into all the incorrect statements made here. Long story short, there is a reason why no one cares about analog computers, and no one ever will. It is a niche product for a niche use case. It can never replace digital computing in the broad sense.
Noise is by far the biggest issue with analog computing, it accumulates at every step and getting rid of it is very complicated. I would have loved to hear how they deal with it, but I guess they wanted to attract investors with buzzwords instead.
Spot on
Same. I didn't fully realize how much of an issue the noise ratio was in practice until I read through some of the comments here. But I don't want to follow too much of what is already done until I create my own model.
but that's already getting solved
Same problem in quantum computers.
@@bengsynthmusic Yeah, I actually may have overstepped my knowledge in a conversation on quantum with regards to noise the other day. I should go back and humbly apologize.
I love the certainty they said 30 billion units in 10ish years and the vagueness of the actual way it works "Yeah, a bit of analogue and a bit of digital". My degree is in robotics and mechatronics engineering so I am interested in this topic but learnt nothing from this video.
Electronics tech here, seems like an infomercial... sounds good, where's the real tech?
I think it's because they don't really present any new info. Engineer here, now no longer in the industry, but a huge fan of analog computing since the 80s.
No one comes to 'Freethink' to actually learn anything, or (in fact) think freely.
@@brapamaldi Aha, I hadn't seen any of this channel until now and I'm quite unimpressed.
Dude, it's an ATmega + ADC in an SoC form factor. Where did you get your degree?
Great, now make it available to the hobbyists so we can start exploring the possibilities.
my thoughts exactly
There's especially a few prosthetic arm/hand makers here on TH-cam who might be interested. Biological "analog signals" can be more than just heart beats. (especially since that beating isn't a direct signal itself, it's a signal from the brain transported down the vagus nerve)
Yes Please
Do you know how much this is gonna cost right now in a low production run, lol? Each chip that they could sell right now could be millions each when you take into account parts of the wafer with substandard performance. Youre going to have to wait 5-10y for scale to make it accessible to hobbyists. Im sure if youre a billionaire with a maker hobby, theyd be happy to sell you one right now.
Love to see some early researchers protect this stuff with a GPL so it'll always be in the hands of the people.
We've had programmable analog arrays for a while now. They're similar to FPGAs, but with analog building blocks, and generally haven't seen much market adoption, since most mechanisms that need analog electronics can be made more cost-effective, efficient, and easy to produce by just building the analog circuits necessary as purpose-built systems. There are definitely interesting applications for field programmable analog arrays, but I think they're largely overestimating the general usefulness of it - understandable, since they want to make money!
That's how I've felt about it-great for edge computing, but people forget just how finicky analog is with different transistor responses, shielding for noise, voltage drop, etc, etc. It just isn't reliable for long and intricate calculations. A standard 64-bit floating point number has magnitudes more SNR than even a shielded wire.
Let's say they truly solved the finickiness of analog. They are still providing very focused processors that require physical upgrade. I think they captured their limitations well in the video, so there is something to be said for that... But if you wanna upgrade, replace the hardware. Dang, that's starting to sound like Apple.
@@QuintinKerby lol on the Apple point!
To your main point though, I feel like having an attached FPGA would make much more of a difference than what analog could do for high-end computing. Analog really is finicky, but is great for more simple tasks due to energy efficiency (like edge computing for a simple neural network). The issue is from what I understand, it falls apart pretty fast when precision is required. Analog signal processing is kind like black magic...
I know of a consumer (Honestly really a prosumer/professional device but available to any consumer) device that actually utilizes this right now (and was released about 5 years ago), and it's not super popular. In the audio processing world there is a device called the McDSP Analog Processing Box. It has 1 FPGA that interfaces with the computer to program the analog arrays to mirror the processing that the McDSP suite of audio plugins did. (For those who aren't in the audio world, all of those plugins are actually modeled off of older existing analog devices.
AI, my friend. It's all about AI.
This is yet another proof that you can sell "nothing" to the majority with a good presentation.
why do you say that?
@@jackytaly It's so easy to fool a non-technical person with a bunch of technical jargon in a professional-looking explanation.
Because he talked a lot about how analog computing is great (it is), but said effectively nothing about why his company is great or how they get these outrageous efficiency claims. I can also create a good presentation with even better claims, but that will not fix the underlying issues analog computing has. @@jackytaly
Yeah, this video explained nothing. It is unclear what they are even selling. A programmable filtering and combination circuit?
@@extended_e Energy-efficient circuitry.
I want my 11.75 minutes back. Far more "let me sell you a dream" than actual information in this video.
11.85 minutes. You have a time offset problem. 😄
@@AmateurHistorian999 His phone's processor has too much silicon in it. 😂
Yeah look at the body language, it speaks volumes
Tell me you don’t have a million TH-cam subscribers, without telling me your level of insecurities ¯\__(°_o)__/¯
"We have solved the voltage offset problem, but we won't tell you anything on how we do it" Great. That was the the most important thing to know.
Well, if you can't figure it out, you're not very good at analog circuit engineering.
Most will lock mentally onto DC voltage offset, forgetting that analog signals are not steady-state signals, so adjusting threshold levels to compensate via, say, a resistive ladder would never be effective, while tossing in some wave shaping and capacitors could make all the difference. Or even using clipping.
Wide palette to work with, depending upon the specific problem.
@@spvillano Wouldn't some sort of self-adjusting rolling average also work? idk
@@br45entei it should. Although for slowly changing measurements like astronomical data, that might unintentionally obliterate the data.
Probably similar to gamma spectroscopy, where you set a lower threshold value and then a max value, essentially creating a 'window' that a peak has to enter, in order to filter out noise. This is essentially how a discriminator works: it eliminates outliers that are too high or too low, and with adjustable windows (or discriminator apertures) you can fine-tune it to the signals you are interested in.
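A minimal sketch of that discriminator-window idea, combined with the rolling-average baseline suggested a couple of replies up (the thresholds, smoothing factor, and test signal are invented purely for illustration):

```python
import random

def discriminate(samples, lo=0.5, hi=2.0, alpha=0.01):
    """Yield samples that fall inside the (lo, hi) window after subtracting
    a slowly adapting baseline (an exponential moving average)."""
    baseline = 0.0
    for s in samples:
        baseline += alpha * (s - baseline)   # tracks slow drift / DC offset
        x = s - baseline                     # offset-compensated value
        if lo < abs(x) < hi:                 # keep only events inside the window
            yield x

# Noisy test signal: small noise everywhere, an occasional "real" event of ~1.0
signal = [random.gauss(0.3, 0.05) + (1.0 if i % 50 == 0 else 0.0)
          for i in range(500)]
events = list(discriminate(signal))
print(f"{len(events)} events passed the window")
```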
Programmable gains or attenuation are simple to do. But the whole concept is just exploiting people's lack of knowledge, as they bang on about stuff you can easily do digitally at extremely low power and still get years from a battery. And the bit about your TV remote draining power is BS: remotes don't transmit until you push a button, i.e. any button press is what connects power to the remote's chip. Besides, whether you get 6 months or 5 years out of a TV remote battery is not a big deal.
Okay, but how does it work? This is just a long marketing video
Looks like it does not
They're looking for clever people like you to figure that out; then they market it.
This is typical of most of the 'stuff' on TH-cam. Get an engineer to ask these guys questions and they will start with "What can it do?" Fine, but also ask "What can't it do?" and show me the sweet spot 'you think' will propagate... then leave it up to creative folks to find other uses. Wish we had those kinds of videos.
It uses Ohm's law and KCL to measure the voltage drop over a known resistor value. It has a ton of these basic loops in a large matrix, so it can perform a crazy number of calculations per watt compared with what a GPU built from today's transistor-limited chips can functionally do. The big problem with these chips is their inaccuracy due to noise, and it is hard to filter every single signal without converting it to digital, which defeats the entire purpose. I think these chips really do have a future in our technology, but in the same way quantum computing is a crazy upgrade: these forms of computing won't take over from digital chips, but will instead add efficiency where digital tends to struggle. In simpler words, these chips are suitable for very basic functions that run at very high speed with a cheap power requirement, so machine learning is perfect for this, as it is basically a brute-force style of programming.
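For anyone wondering what "Ohm's law plus KCL as computation" looks like, here is a toy numerical sketch of the generic resistive-crossbar idea (not this particular company's design; all values are arbitrary):

```python
import numpy as np

# In a resistive crossbar, each weight is stored as a conductance G = 1/R.
# Driving the input lines with voltages V makes each cell pass I = G * V
# (Ohm's law), and each output line sums those currents (Kirchhoff's current
# law), so the output currents are literally the matrix-vector product G @ V.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens (the "weights")
V = np.array([0.2, 0.5, 0.1])              # input voltages

I_out = G @ V                              # what the analog array computes "for free"
print("output-line currents (A):", I_out)

# Real devices add noise and mismatch, which is exactly the accuracy problem:
I_noisy = I_out * (1 + rng.normal(0.0, 0.02, size=I_out.shape))
print("with ~2% device noise   :", I_noisy)
```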
Every time they say 'wake up a digital chip to do communication' you should know that they are selling snake oil.
Yes everything in the real world is analog, and so is all communication including wifi and ethernet. The physical layer is always analog.
If there was any substantial info in this video, I would have hoped for them to show a minimal sensor reading transmitted wirelessly (433MHz or similar). It doesn't have to be wifi. But that's a great buzzword that everyone knows...
And just repeating "it will be more efficient" doesn't improve your claim at all.
All the stuff about analog co-processors that wake up larger digital ones, e.g. voice activation, is already done with digital coprocessors.
I don't know enough to be able to call them out on any specifics but the whole way through the video a bunch of red flags were going off in the back of my mind, and the fact that at no point they explained how this thing works just makes me doubt that it actually does work at all.
You can't simply say that you addressed the issue of analog signals degrading over time without any further insight into how, and what about the general noisiness of analog? I remember being taught that the problem with trying to do calculations on analog measurements is that small errors accumulate into much larger errors very quickly, which is why we convert things into fixed (digital) numbers first. So this would only be useful for simple tasks like comparing input against predefined patterns, and then it would need to call a digital system to do anything even remotely complex.
I'd love to be wrong and see this actually take over the world in a decade or so, it'd be really cool, but for now I'm going to stay cautiously skeptical about the whole deal.
In my opinion they did a horrible job of explaining what their device even is and does, and how it works. All I heard was: jargon jargon, cool analog circuit, programmable, makes big change, believe us. I still have no idea how it works. Analog in -> magic -> ? out. What functions can it actually perform? Is it Turing complete? Etc. Sounds like a bad investor pitch.
@@extended_e yeah it doesn't even hint at what it achieves or how, and the only demonstration is that it can recognize one specific sound... that's coming from a recording so it's always the exact same sound wave, not particularly impressive. I remember automatic answering machines doing the same back in the 80's or 90's.
I'm not even sure this thing is actually doing things in a truly analog way; the vagueness of it all makes me wonder if this isn't just a fancy DAC, or worse, repurposed audio equipment. And nowadays, whenever I hear someone say "AI" and "machine learning" multiple times in a short presentation, a "potential BS" alarm goes off in my brain.
Ones and zeros are really just electric pulses, so digital is basically analog underneath. But no, the video does not go into detail about what it is talking about and therefore means nothing to the viewer who is trying to understand. I think the real question is: what energy-efficient method is used to create the pulse-wave information?
@@extended_e this is it... what function can it perform?
I set a thermostat. It has to have a threshold and hysteresis. It controls a relay, which also has a threshold and hysteresis. I need some sort of ADC or quantization of signal levels to actually do anything with a thermostat and a relay...
Exchange the relay for PWM or something... it still needs quantization.
I can have a mechanical thermostat that uses bellows and levers. That's analog. But it still operates a digital switch: on, off. I could maybe throw in a resistor, make it drive a potentiometer... or maybe a saturable reactor, or a moveable-core inductor... and have it vary linearly, smoothly... but to what end? It doesn't improve efficiency, it makes it worse. The device doesn't need such complex controls. Just on, and off. (A quick sketch of that on/off logic follows after this comment.)
Whereas if I want to produce a parabola on my lathe, I can either make some linkages and pivots that follow the curve x = y^2... or I can convert it to CNC and punch digits into a computer.
The linkages: that's analog. The result is analog.
The computer? Its output is analog, but everything else was done digitally... and depending on resolution, get right down to it and that "analog" surface is just small thresholds, levels, quanta...
In my mind, "analog computing" in the electrical domain is just a distorting amplifier, the output simply being the input distorted in some manner, be it phase, rise time, or other things. In some scenarios that's useful; in others it's a waste of time.
I'm not an expert either, but I was hoping they would say it wasn't hackable, and I'm not sure that is the case.
As an analog design guy, I know very well how hard it is to tune an ASIC for each use case; even with a lot of programmability it will require a lot of designers. Designers that are not available, as the training is hard.
Is AI able to help out in any way?
@@redone823 Right now, no (I've tried, briefly). Maybe in the future, but there are so many variables, it's hard to predict.
I might well be wrong, but this analog computing thing just sounds like designing specialized hardware, doesn't it? If so, this is absolutely not new.
@@aocastro For me it's an ASIC dedicated to voice recognition. The neat part is it comes with more versatility, as the recognition pattern can be programmed on the fly. Some kind of resonators and a programmable matrix of weights of some sort.
Only god can help me pass this course in college this semester
I've been intrigued by the potential of analog computing since I played around with classic analog computers in college. That said, I'm not so sure power savings for sensing will be the driving force in adoption of this new version of the technology. Perhaps the most promising opportunity is actually in AI inference engines, which rely on comparison and relative closeness rather than absolutes.
Underrated comment right here
Agreed. The power consumption of a decently built digital chip can be very low, almost immeasurable by some multimeters. See clocks or remotes that already run past the expiration date of a battery. Often it's a battery-type mismatch, e.g. installing an alkaline cell instead of an acid one; alkaline batteries are designed for high-current loads, not continuous loads. I think that is what can be improved if needed, neither analog nor digital.
I mean, we're already making digital circuits using lower-precision formats because the accuracy isn't that important. It's not obvious to me that analog will ever actually be better than a digital circuit of similar "close enough"ness.
Anything that's not strictly reproducible makes debugging and validation *really* hard once problems leave toy-level scales.
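The "close enough" point is easy to see with plain int8 quantization, which digital accelerators already use for exactly that reason (a generic sketch, not tied to any particular chip):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(0, 0.5, size=1000).astype(np.float32)    # stand-in for NN weights

# Symmetric int8 quantization: map the float range onto [-127, 127]
scale = np.abs(w).max() / 127.0
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_hat = q.astype(np.float32) * scale                    # dequantized values

err = np.abs(w - w_hat)
print(f"max abs error: {err.max():.5f}, mean abs error: {err.mean():.5f}")
# The per-value error is bounded by scale/2: known, fixed, and reproducible,
# which is the property analog hardware has a hard time matching.
```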
What you say makes more sense to me than the whole video. Now I think I sorta understand the difference. I would agree with you.
@@pfeilspitzeis that true for the AI inference engines that the OP mentions though? At scale, nobody can debug or understand deep neural networks anyway. So long as the noise in the system doesn't multiply and accumulate to create insane outputs, this seems like a reasonable area of exploration
Analog never went away don't worry :-). Every tech company has someone who knows their way around analog filters, oscillators and control systems as a bare minimum. The market for discrete components is still a growing one. Digital is simply quite often a cheaper solution as it takes less time to develop and is highly configurable, but whenever things like cost for mass production, reliability, and longevity play a role, analog never went away. Some industrial PI(D) controllers, low noise power supplies, RF amplifiers, calibration equipment... In some cases only the interface is digital, the part that does the thinking is usually fully analog...
I'm absolutely of the opinion most problems in electrical engineering can be solved with analog systems, as a hobby I enjoy designing circuitry to tackle these things. But working for a company where electronics design has to pay for the paycheck of me and my colleagues, you need to pick whatever is the cheapest and fastest to produce and a digital system is usually preferred if its nothing too critical. Digital systems are very robust as well, you can effectively store values as binary data rather than voltages and it is very resistant against electrical noise! With analog systems you're constantly thinking about shielding, ground loops, ground planes, crosstalk due to parallel wires or traces, manufacturing tolerances of components, power consumption, non-linearities, parasitic capacitance and inductance, trace resistance, bandwidths, temperature drift... Digital circuitry just works straight out of the box and is resistant to almost all these things.
Analog technology is big in footprint too. You're not working with components that just switch on or off; you work with components optimized for linearity, tolerance, matching, logarithmic behaviour, or temperature stability, so strictly speaking, for active components, the surface area such a component needs is much bigger. The more accurate your computation has to be, the more parts you need to add to your topology to stay within your needed tolerances. We sometimes ovenize parts for stability, or slap a heating element right next to them (a resistor or a carefully "shorted" transistor), simply to fight these things. Digital systems behave the same over a very wide temperature range without losing accuracy, so in many applications digital systems may be much more power efficient.
I think the rise of "analog computing" is mainly a terminology thing. Yes, an analog control system is actively real-time "computing" and solving complicated differential equations, but since it just inherently is meant to do that, you don't say it is computing, it is stabilizing the system. You don't refer to it as a computer but simply a controller. We don't call microcontrollers computers either anyways. Analog computing is nowadays used as a buzz word. Real analog computers ARE highly configurable instruments, but also very impractical and competed out of the market many decades ago for that reason.
Besides this, education tailors their studies towards demand of research and companies. Digital systems have way more practical use cases, so when you go study electrical engineering, most of the courses will be centered around digital systems. The result of this is obviously that most people by percentage will specialize in digital electronics. This is not a bad thing, there's still many people whose first projects were building guitar amplifiers or radios and just sit through education in order to go back to analog stuff again.
Analog and digital signals processing is an age-old established marriage, it was and still is a big field of research and education, resulting in an insanely large number of DACs and ADCs being pushed to the market to facilitate this.
Although I am very passionate about analog electronics design myself, I think it is incredibly important to also understand its limitations and impracticalities. There are a lot of benefits, but also a BIG set of impracticalities you do not put forward. Videos like these leave me very divided. On one hand I love seeing analog stuff represented outside of the audiophile world, in a community that is dominated by microcontrollers; on the other hand I feel like you are wearing prominent eye-flaps and are underestimating the number of analog specialists still walking around today. There are literally conventions for this stuff being held everywhere, and breakthroughs are still being made by companies pushing better and better analog components to the market.
Analog will never leave the tech industry. For as long as the world we live in remains analog, the specialization will exist, and the amount of people working within it will grow or shrink depending on demand and practicality.
This comment was more informative than the video
"still walking around today" thank you for stayin' alive. It's one of the coolest, yet least appreciated fields of human intelligence.
Great response! Do you have any video or text you could recommend for learning the difference between digital and analogue? Thanks!
@@noduslabs The difference between both technologies is very simple actually.
The word 'analog' comes from the fact that electronics were used to build models ANALOGOUS to real world systems. You can model most physical systems in an electric circuit, where the big benefit is you can speed up or slow down time as a constant and conveniently adjust parameters. People used these in the past for modeling things like a car suspension, the trajectory of a satellite, stiffness of a bicycle frame, modeling a car engine before production etc. There were highly configurable analog computers to program these systems in laboratories, but also non programmable ones in control systems theory (a d-jetronic engine control unit was an analog computer to control the timing in old car engines). This tech also extends in other domains like mechanical (the highly configurable differential analyzer for instance, the globus from the soyuz), fluids (moniac) etc.
The word "digital" comes from your... digits. You can count fixed intervals on your hands. So anything with a stepped nature is at least partially digital. Digital computer systems operate using numbers in fixed increments and time in fixed increments. The higher the resolution, the better the accuracy. The benefit is that it's way easier to differentiate between a fixed set of high and a low signals compared to one signal that can be high, low and everything in-between. It's important to note digital does not mean directly it works with a binary system. The soviets successfully made powerful computers using a trinary system, the very first computers, both mechanical and electrical, mainly used the decimal system. Early computers for somewhat fixed tasks were fully mechanical (pinwheel calculators, curta, analytical machine etc.), highly configurable ones used electromechanical parts (zuse, facom), and later switched to fully electrical and solid state computers. I'm aware of one truly turing complete programmable mechanical computer to ever be super succesful commercially to be the Ascota 170 (I own one myself as well). It runs programs, as well, doing calculations using the decimal system too... despite it being digital and being an analog engineer myself, It's still the single most impressive piece of technology I've ever got to work on. I still can't comprehend how people managed to design such thing, not even mentioning how incredibly reliable they were.
Obviously both technologies are used nowadays and there's a whole subset of parts specifically to do conversions from analog to digital and from digital to analog... Even digitally controlled analog computers have existed for a while, but they were so horribly expensive, they didn't stick around for very long.
As for books to recommend... for me it is really hard to help you out with titles since most material I learned from Dutch and German textbooks. The rest of the knowledge came by simply repairing electronics, experiencing how other engineers solve problems and by doing a LOT of design work (preferably without simulations, but by actually building and measuring)... Internet is an alright place to start, but most electronics related content revolves around microcontrollers nowadays. Especially about analog systems, the golden years were from the early 1960s to the early 1980s or so (unless you master an Eastern European language, they continued developing analog stuff for an extra decade). Little advances were made discovering new components afterwards and the world shifted digital. Anything before relies on vacuum tubes and germanium semiconductors. Despite being very interesting and in rare cases actually useful, it won't serve you in the beginning of your adventure :-).
For a general direction and to admire computing history I added a lot of search terms between brackets you can google. Obviously these are the famous systems from literature covered somewhat extensively on the internet. I recommend you to go to tech and computing museums to find way more interesting and obscure technologies. Many people came with wild and creative solutions to solve problems back in the days... From modeling a country's economy using water, making logic gates for high radiation environment that use high air pressure channels, to predicting the tides using a belt and pulleys...
Wow, you really offer a brilliant and concise analysis of analogue versus digital, and how well-developed the field actually is. Thank you for taking the time to do so! There were some other comments about how this was more marketing and investment-hunting than informational, but now I understand *why* it is!
I do believe analog is the right choice for things like machine learning, since it has a probabilistic foundation. But for practically any other case digital is far better; the rise of digital was mostly due to its help in discrete computing of hard numbers in banking and offices, that is never going away.
I won't ever buy a phone that's fully digital, nor a car.
For me analog is a sensation, like opening a door or pressing a button. Without that feedback, digital is basically just wind.
@@lm4349 Well, in electronics, digital and analog refer to very different things than what you just mentioned.
Plus there is no way that a car could ever be fully digital, because all the sensors are sending out some voltage.
This voltage is analog and needs to be converted to a digital signal before being interpreted by the on board computer. Even the input of a button needs to be converted.
Voltage is not really something you can feel (unless it’s large enough, but don’t do that). What you can feel is the interface between you and the sensor.
When banks go analog, they will no longer show your account balance in digits, but using an analog VU meter. :-)
A drawback to analog computers is that the elements have to be hand-wired for each type of calculation. Some form of digital switching would still be required to automatically set up an analog processor for each task.
They literally said in the video that digital isn't going away and that digital and analog will work together.
The problem with analog is that you cannot daisy-chain processes like in digital. In digital there is no loss of data because a 1 is always a 1, a 0 is always a 0. In analog, a wave is never accurate. It's a close approximation at best, even if you correct for it like they're stating here. That's because even a single analog number holds a (literally) infinite amount of data; you simply cannot fully compensate for it. This means that, as you send the wave from device to device, the wave distorts more and more (pretty much exactly like how video degraded when copying from VHS tape to VHS tape in the 90's). Mind you, this doesn't make the tech inherently bad. It's definitely very useful. But it's really not as simple to use correctly as digital is. Pros and cons and all that. There are very good reasons digital replaced analog, and saying things like "rebuilding the world from scratch" is being markedly naive.
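The VHS analogy can be put into toy numbers: the sketch below pushes a value through a chain of noisy "copies", once with the errors simply accumulating and once with digital re-quantization at every stage (the noise levels and step sizes are made up for illustration):

```python
import random

def analog_chain(value, stages, noise=0.001):
    """Pass a value through `stages` analog copies, each adding independent noise."""
    for _ in range(stages):
        value += random.gauss(0.0, noise)
    return value

def digital_chain(value, stages, levels=16, noise=0.001):
    """Same per-stage noise, but each stage snaps the value back onto one of
    `levels` fixed codes (4-bit here). Because the noise is far smaller than
    half a code step, the stored code never changes and errors don't accumulate."""
    step = 1.0 / (levels - 1)
    for _ in range(stages):
        value += random.gauss(0.0, noise)
        value = round(value / step) * step
    return value

x = 0.52
print("analog after 1000 copies :", analog_chain(x, 1000))   # wanders a few % away
print("digital after 1000 copies:", digital_chain(x, 1000))  # parked on the nearest code
```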
The board, the chip, the battery... looks normal. The guy speaking, the CEO? Maybe overselling the revolution??
I instantly don't trust anyone who does the Mr Burns "excellent" finger tips touch, while talking.
CEO: "AI and machine-learning algorithms"
Investors: Shut up and take our money!
Not that I don't believe that analog could be more power efficient, but it would have helped if you did some back of the napkin calculations to illustrate by how much.
By 1000 to a million times. In the future it could be possible to run a chat gpt like model on a desktop.
@@kushalvora7682 Ok, but that's just hand waving. How did you calculate these numbers?
Veritasium made a video about 'Mythic', a company that is making analog computers, and they stated 23 TFLOPS (matrix multiplication) using 3 W of power.
Ah, and they are not FLOPS anymore because we are working in the analog domain, so 23 TOPS.
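Taking those quoted figures at face value (23 TOPS at 3 W), the back-of-the-napkin calculation asked for above is just:

```python
ops_per_second = 23e12   # 23 TOPS, as quoted above
power_watts = 3.0

ops_per_joule = ops_per_second / power_watts
picojoules_per_op = 1e12 / ops_per_joule

print(f"{ops_per_joule:.2e} operations per joule")          # ~7.7e12 ops/J
print(f"{picojoules_per_op:.2f} picojoules per operation")  # ~0.13 pJ per MAC
```

Whether that works out to 1/1000th of a digital chip depends entirely on which digital chip you compare against, which is exactly the number the video never provides.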
@@kushalvora7682 You can already :)
You know what's less energy intensive than a voice activated kitchen tap
A hand operated one
I watched the whole thing and heard a bunch of "this technology". What is it, though. How does it work? Is there a paper I can read that tells me how they are doing this? The power consumed by an Analog to Digital converter isn't significant enough to claim this much power savings. I had to search for details on their AML100 chip to figure out what they are talking about. Maybe adding a bit of technical detail or providing links for those interested would be a good idea in the future. Without that info it is difficult to decide whether this is pathbreaking technology or just another Turbo Encabulator.
Hey, don't knock the Turbo Encabulator. Between that and my Interocitor (which I built myself!) I'm 30% more efficient, and 20% more capable.
It's this technology, of course.
"You'll save energy!" "We've got a world-changing secret!" "Save money and time and power and the world too!!!!" BS BS BS BS BS BS BS!!
Why do I get the sense I'm watching an infomercial?
You're not.
It's a call for investors.
It would be fucking silly to go back to analog. I'm a huge analog synth nerd, and let me tell you: the reason we love them so much compared to digital processing is because they are fucking silly. They decay with time. They need to be repaired and taken care of forever. You have to let them cool down or heat up when you carry them around, because of changes in temperature, just like an acoustic instrument. Would you like it if you had to retune your cell phone like a guitar every time you wanted to send a file?
That truly just feels like that con artist of a health company from a few years ago that promised bibbidi-bobbidi percentages of better blood analysis that would revolutionize everything.
The real reason why digital replaced analog (and that seems to be strangely missing from buddy's explanation) is because it is reliable, no matter the state of the system. You could send a Game Boy through a bombardment in Afghanistan and have it eat a full-on third-degree burn, and that boy would still play your Super Mario just fine. Even when a digital system is offset, it works great, because it's all ones and zeros, aka a beep is sent or nothing happens. It can be referenced in time by the system, and even if some signal lands between 0 and 1, the system can still sort it into the one pile or the nothing pile. Throw all of that out the window with analog; you would have to take extra care of everything. Your friend would send you a Word file, and it would literally crumble over time just like a sheet of paper. Your favorite pictures would fade over time (an infinite array of pigments activated by a light source? that's a Polaroid for you). No AI could help you or whatever for reference; it would either end up using the same universal language that digital does, while giving you the illusion afterwards that it was all yours, or just skip that step entirely and give you something different or degraded. Great.
He said that their software fine-tunes the analog signal so that it's reliable digital bits & bytes. So, it's not analog computing. It's a better analog to digital conversion for digital computing.
It's analog computing, as it performs transformative operations on analog data, and you can program a pipeline to perform those operations in a specific order and manner.
It's not a simple analog-to-digital conversion, which is fixed circuitry that can't be programmed on the analog part of the signal path.
@@anykeyh So it's more like a sound card?
@@anykeyh According to his exact words, they fine-tune the analog signal. For what purpose? For that matter, why even do that if you're computing on the analog signal? The answer is right there. By calling it analog computing they're muddying the waters. It's like calling LLMs artificial intelligence: they're not even remotely intelligent; they're really complex filtering and composition tools.
1. I've been watching these developments for a while with rapt attention
2. I think the biggest barrier to entry for these devices is purely going to be economy of scale. Once we can get the full production and shipping cost below energy consumption savings for each application, these take off. Fast.
And I can easily see a way for this to be the best way forward in so many places. Right now they're talking sensors, but these could also be incredibly useful for mathematical logic. What I feel they're not talking about is how this is going to be a game changer for AI in the long run as well, where thresholds change decisions. I feel like this is one of the final pieces in a particularly dangerous puzzle, though.
I feel like I'm missing something. I don't understand how energy savings could ever offset the cost of a device like this.
Let's take an "expensive" inefficient industrial communication technology, 4-20 mA, as an example of something this technology could reasonably replace. A 4-20 mA device consumes about 2 kWh or less per YEAR. Yes, 4-20 mA is an analog technology, but most modern devices include microprocessors and can do simultaneous digital communication within a power budget of 4 mA, so it doesn't even cost any extra to add digital functionality to this kind of device.
This new technology would stand to save maybe 50 cents a year in energy costs for this common application. I don't see how that would ever offset costs, and that's ignoring any questions about reliability or signal integrity, etc.
I think this is cool and I'd love to see it gain traction, I just really doubt the whole energy savings argument.
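Rough numbers behind that estimate, assuming a typical 24 V loop supply running around mid-scale (both are assumptions; exact figures vary by installation):

```python
loop_voltage = 24.0      # V, a common 4-20 mA loop supply (assumption)
avg_current = 0.012      # A, roughly mid-scale on a 4-20 mA signal
hours_per_year = 24 * 365

power_w = loop_voltage * avg_current               # ~0.29 W continuous
energy_kwh = power_w * hours_per_year / 1000.0     # ~2.5 kWh per year

print(f"{power_w:.2f} W continuous, {energy_kwh:.1f} kWh per year")
# At typical electricity prices that is on the order of tens of cents per year,
# which is why the energy-savings pitch is hard to justify for this use case.
```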
@@Tom-xs7gp Running ChatGPT costs around $0.03 per query. Precision of the numbers is not really important for AI, so they are already quantizing models down to int4 or even less (that's 4 bits, 16 values). If you could do these calculations in analog it would be even more efficient, so I think that is where there is the most to win.
And analog computing won't change everything; it will just be used in some places (maybe even on a digital CPU) to handle some stuff faster.
What will be funny is the same AI models giving different answers because they are analog and sometimes take a different path.
@@kuro19382 I don't know about OpenAI, but there are several quantized and scaled-down LLaMA LLMs you can test on your own computer. Search for WebLLM for one that runs some models in the browser on your GPU.
Neural networks built out of op amps!
@@The_Conspiracy_Analyst It's ON! It's OFF! It switches only a bit faster than you can operate a light switch...
Let me think from the industry's point of view here: if they can produce the same result at a lower cost to them, it will be done. If the cost is the consumer's responsibility, as is the case with energy prices, the industry doesn't care. Semantics also matters here: after the millions spent on marketing promoting "digital" as modern, superior, and advanced, relative to "analog", which today is seen as old, backward, and inefficient, analog will just change its name, the way UFO became UAP.
Digital has always been much easier to wrap your head around, because it's just on or off; there's no gradient or continuum where the values can sit anywhere between two extremes.
That's why to me the technology of the 70s, 80s, and early 90s was such a marvel. The insane wizardry required to make those technologies work back then was just incredible. They were technical feats that required real out-of-the-box thinking. With digital, it all just becomes a matter of quantity, such as packing more transistors into an increasingly smaller space. But now that we're reaching the limits of our current materials, we're no longer getting the returns that we'd like.
So going back to analog, while using the current technologies and manufacturing processes we've developed for digital electronics, may let us eke more out of the materials we currently use to make chips without having to turn to more exotic ones.
"UFO now it's UAP" isn't the best example could've gone with. You wouldn't call the Aurora Borealis a "Flying Object", right? Signal flares, lasers (either originating from the ground, or from satellites), rainbows....none of them involve aircraft. So if the current term doesn't fit them, you need a more encompassing term.
@@Vaeldarg Yes, I made a simplistic analogy, but even though you don't exactly agree with it, you understood my point well enough to correct me. Thank you.
Don't think it's about the bill; energy consumption changes how things are designed. If you were to implement a monitoring system in your home with hundreds of sensors, it's useless if all of them need a battery change every other week. Such a design is never going to be implemented.
That's an interesting insight. In music production, "digital" typically means harsh, grating, and unmusical, while "analog" is synonymous with warm, lush, and full of natural harmonics. If you peruse Sweetwater and other vendors of musical products, just about every digital device boasts that it sounds more authentically analog than any other digital device ever made. In the same way, nearly all high-end guitar amplifiers still use vacuum tubes. I've tried several of the latest "emulated analog" digital amplifiers, and not one has come close to the tone and squishy responsiveness of my old tube amplifiers.
and on the "slain a dragon" claim, my scam alert signal started blinking like a christmas tree.
Also, everybody there is repeating the same basic sentences over and over without ever getting into the how.
"you don't have to convert to digital" HOW "this way it is way more energy efficient" in general analog is NOT efficient, WHY should this be. then they switch to another clown and the new clown repeats those same sentences with a fee words changed, and the cycle repeats.
One big problem:
Natural noise affects the results, which means there can be unwanted errors and bad precision.
I've also been building a DIY analog setup for about 3 weeks, and its precision is around 1 picoamp and 1 picovolt,
which sadly is too small for me to easily work with,
but it is still powerful precision.
Let's hope this company does something better.
Other than photonic computing (which also has a lot of challenges), I don't see analog computing being used for things where we need precision (there are actual uses, though, like low-precision AI inference).
Of course, analogue computing isn't meant to replace digital. They have some specific applications, like in the case of training neural networks, as they can still work fine with a significant margin of error.
how is that different than let's say digital signals like wifi: if the device complies with part 15 of the FCC Rules. Operation is subject to the following two conditions: (1) this device may not cause harmful interference, and (2) this device must accept any interference received, including interference that may cause undesired operation.
you must always check your inputs :P
@@iWhacko Come on, you know what I meant by natural noise.
And you know the space radiation effects etc
And you know that analog computers can calculate all digits instead of 1 at a time
@@LiberalEntrepreneur Yeah, I hope taking a bunch of averages solves these, because advanced AI is only slightly affected by error, something like 0.1 in terms of error-rate cost.
It's not an analog computer. It's just another sensor.
And isn't that already in use in phones? Like "Hey (S)iri": a passive sensor listens to the "S" frequency band and then wakes the CPU from its low-energy state?
It’s not a sensor. Sensors provide analog data, this takes in analog data and performs calculations therefore it is a computer. It is huge, because now we don’t need to convert the provided sensor analog data into digital data in order to compute it. It rejuvenates Moore’s law in a sense.
Great topic.
I had a flashback to college, learning A-to-D converters 🤣
What was old will become new again, and just because something works well does not mean it can't be improved.
Cheers
In my curriculum (Bachelor in Industrial Information Systems), I was also required to interface sensors with automata.
We also had "signal processing" because you can do miracles with the proper circuitry.
Instead of trying to filter unwanted data in code, you can block the signal with RLC.
Or, instead of trying to know if the signal has the same shape, you can have a delay line with a comparator.
Analog is still well and alive!
Though I forgot most of it, thanks to programming 😂
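The RLC point is a nice example of analog doing the filtering "for free"; for comparison, the design math for the simplest analog filter is one line (a first-order RC low-pass, with arbitrary example component values):

```python
import math

R = 10_000.0   # ohms (arbitrary example values)
C = 100e-9     # farads

f_c = 1.0 / (2.0 * math.pi * R * C)   # cutoff frequency of a first-order RC low-pass

def gain_db(f):
    """Magnitude response of the RC low-pass at frequency f, in dB."""
    return -10.0 * math.log10(1.0 + (f / f_c) ** 2)

print(f"cutoff: {f_c:.0f} Hz")                    # ~159 Hz
for f in (50.0, 159.0, 1000.0, 10000.0):
    print(f"{f:7.0f} Hz -> {gain_db(f):6.1f} dB")
```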
So let me get this straight... they say this thing is insanely power efficient, yet the little board they showed houses a huge-ass battery -.-
We used to have analog computers - but they were superseded by digital computers. I guess there is a reason for that...
Early in the video you talked about the rate of demand for energy for computing outstripping the rate at which energy is produced (which I guess means something like the rate at which energy production is growing)? But analog computing would only have an impact on that problem if the use cases for the analog computers are the same use cases driving up energy consumption. Later in the video you discuss a number of examples of analog computing being used for sensors, and that discussion morphs very quickly into analog computers enabling NEW kinds of sensors. If analog computers, as power-efficient as they are, result in people's houses being full of novel kinds of sensor, all you've done is increase the amount of energy spent on computing AGAIN: you've created a tech niche for new sensing products. Analog computing only helps overall energy consumption if it replaces EG LLM artificial intelligence, blockchain computation etc - existing high-energy computing tasks.
Get the feeling this was a shill for the company making these. A lot of words were said to describe very little. Not a lot of explanation/background was given to it for a layman to understand its significance.
There is something important that is only vaguely explained here. Digital is a language: it needs encoding to transmit and decoding to be processed. This is why digital, the 'language computer', is easily configured for any purpose; it defines its task by reading instructions, and that is where much of the energy is consumed. Analog does not use language; instead it is a scaled model of the world itself. Like one picture being worth a thousand words, the computation is much more efficient because nature works itself out rather than going through an abstraction. I disagree with the argument that analog always saves power. In terms of procedure, yes: an analog circuit does not have to run at gigahertz speeds, constantly shuttling data between registers. But being a scaled representation of reality, you cannot rule out that some tasks have to depend on current, the most power-consuming part of electrical physics. Also, an important aspect of analog systems is that the signal needs to be loud to resist noise and interference, which means more power is spent boosting amplitude. The remote control is also not a good example of an analog application; the exact technicality there is related to the watchdog timer and has nothing to do with the functionality of the device. The example can be misleading, because a remote control is a user interface, and UI is an unnatural thing that is very difficult to do in analog.
Sadly, this is all fluff and no stuff, with no real-life use-case comparisons of power and performance. It seems like nothing but a repackaged CPU that works with some custom analog sensors. It's more like fusion research, which never ends but keeps getting closer to reality every other day. With advances in energy-harvesting technology, digital sensors are already close to an ultra-low-energy reality. Direct analog sensing and processing (digital eventually) is more like reinventing a better wheel, IMHO. ♥️👍
Nice, a 12-minute video with amazing graphics and audio quality about achievements in analog, with zero explanation of what has actually changed... So, "WTF is analog computing"?
Love the production guys! Let’s work on the reporting.
I've previously seen the idea of the opposite of an analogue computer waking up digital components. The idea was that it would be a standard digital computer, and it would have an "analogue card" that it would use for tasks that lend themselves to analogue computing, such as facial recognition.
As an electronics engineer: this is already done on embedded micros. You put the micro to sleep, leave the analogue-to-digital converter running in a low-power mode, and it will wake the CPU up on certain conditions (events). This whole video is just for getting layman investors.
@@ntal5859 This pitch is total BS, but running an ADC in deep sleep is certainly not "analog computing" by any stretch. One common thing we find in any 10-cent MCU is an op-amp used as a comparator to generate wakeup events, which is very different from continuous ADC acquisition.
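A minimal back-of-envelope sketch (Python) of why a comparator wake-up beats continuously sampling an ADC; all current figures and the battery size are illustrative assumptions, not datasheet values for any particular MCU.

```python
HOURS_PER_DAY = 24

def average_current_ua(sleep_ua, active_ua, active_fraction):
    """Duty-cycled average current in microamps."""
    return sleep_ua * (1 - active_fraction) + active_ua * active_fraction

# Assumed figures: comparator + deep sleep ~2 uA, MCU awake ~5 mA;
# continuous ADC acquisition keeps part of the chip running all the time.
comparator_avg = average_current_ua(sleep_ua=2, active_ua=5000, active_fraction=0.001)
adc_polling_avg = average_current_ua(sleep_ua=300, active_ua=5000, active_fraction=0.02)

battery_mah = 220  # a small coin cell, assumed
for name, avg_ua in [("comparator wake-up", comparator_avg),
                     ("continuous ADC polling", adc_polling_avg)]:
    days = battery_mah * 1000 / avg_ua / HOURS_PER_DAY
    print(f"{name}: ~{avg_ua:.0f} uA average, ~{days:.0f} days on {battery_mah} mAh")
```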
I grew up with analog PID controllers in industrial and power plants. When I got my engineering degree I specialized in control technology, specifically analog controllers and relay logic (my professors were stunned at how fast I could design a relay system to control things, since I had been figuring out and troubleshooting them for years in the Navy and in commercial power plants by the time I took those classes). The problem with relays is the power they use and how bulky they are... but for some applications they are still the best, since they are cheap if you only need a few of them.
I always thought that digital controllers for most industrial plant processes were overkill - and they sure are more time-consuming and costly to fix when things go wrong. With analog you just replace the controller card with the right "dip switch" settings and adjust a variable resistor or variable capacitor to get it right (normally takes a few minutes). Then you repair the old board by finding the component that has failed. (The control law such a card implements is sketched below.)
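For readers who haven't met them: an analog PID card realizes the law u = Kp·e + Ki·∫e dt + Kd·de/dt with op-amp integrator and differentiator stages. The Python below is only a discretized illustration of that law; the gains, setpoint, and timestep are arbitrary, and on the real card "tuning" means turning a pot rather than editing numbers.

```python
def make_pid(kp, ki, kd, dt):
    """Return a stateful PID step function: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    state = {"integral": 0.0, "prev_err": 0.0}

    def step(setpoint, measurement):
        err = setpoint - measurement
        state["integral"] += err * dt                 # the op-amp integrator stage
        deriv = (err - state["prev_err"]) / dt        # the differentiator stage
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv

    return step

pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
print(pid(setpoint=100.0, measurement=95.0))          # controller output for a 5-unit error
```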
I'm glad to see analog making a comeback. It sure uses a lot less power than digital computers.
Still don't get the computing part. What I get is that they are only talking about analog input hardware. Where is the "computing" part done in an analog way? For example: how do you sum two numbers? Join both input voltages in series?
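One classic answer, as a hedged aside: analog machines add by summing currents at a node, e.g. an inverting op-amp summer where Vout = -Rf*(V1/R1 + V2/R2). The Python below just evaluates that formula with illustrative resistor values.

```python
def summing_amp(v1, v2, r1=10e3, r2=10e3, rf=10e3):
    """Inverting op-amp summer: Vout = -Rf*(V1/R1 + V2/R2)."""
    return -rf * (v1 / r1 + v2 / r2)

print(summing_amp(1.5, 2.0))   # -3.5 V, i.e. the (sign-inverted) sum of the inputs
```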
Same thing with linear vs. non-linear systems. In nature, things and processes are inherently non-linear; academia and others conveniently linearize systems to fit convenient calculations and mathematical frameworks.
This guy just said "AI" and "machine learning" and made some people think this is a great product. 21st century, I guess.
“my voice sounds like breaking glass”
ah, this was a good one-liner
Am I the only one that heard a whole bunch of nothing here? The video makes the claim that "a bunch of energy" is spent converting signals from analog to digital. While I'm almost entirely certain that that's false (heck, even low-power PICs and Atmels have built-in ADCs), they don't spend any time even attempting to defend that claim. They throw around the term "analog computing" but don't get into any specifics. Does their tech still use the Von Neumann architecture, or is it something else? How does it handle concepts like memory and data retrieval? Their demo showed (what I assume is) a neural network being run via their analog tech. That's great, but neural networks are the exact opposite of "easy to program by writing software", which was what they claimed their new technology had solved in the first place.
It honestly just seems that they've created nothing more than... analog sensors that filter/detect things? Where's the analog computing and the software programmability the video began with? I'm honestly not sure what the hype is. Maybe we get a few more low-power passive sensors if their mysterious tech works, but that's about it?
Yup, that was a big load of nada and a lot of half truths and half lies.
I am a simple man. I see a betterhelp ad and I leave. Do better
We have plenty of microcontrollers that have analog input and outputs. Since we're already handling this kind of input and output, I don't understand what we gain by including this directly into a typical APU, or HPC APU. Seems like buzz about nothing.
They are DACs/ADCs or (de)modulators. The central processor is binary-logic based. These guys are talking about analog CPUs.
The interview did not explain clearly how this "analog computing" works. I looked up the details on Aspinity, and it seems to me the chip functions as a low-power signal threshold detector.
Normal digital processing requires the sensor output signals to be digitized before comparison, which takes computational power for the digitization step.
With analog processing, the sensor signals can be split, transformed, and/or combined with a reference signal for threshold detection, which requires no digitization power (roughly as sketched below).
The interesting part to me is that the chip must be able to generate a reference analog signal out of the embedded ML model for the analog computation. I wonder what the magic sauce is.
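A rough, speculative sketch of that style of front end: band-limit the signal, track its in-band energy, and compare it against a reference threshold supplied by a trained model. This is a generic illustration in NumPy, not Aspinity's actual design (which isn't disclosed in the video), and the frequencies, threshold, and test signals are made up.

```python
import numpy as np

def band_energy(signal, fs, lo_hz, hi_hz):
    """Crude in-band energy via FFT; an analog chip would use filters instead."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return spectrum[band].sum() / len(signal)

def wake_event(signal, fs, threshold, lo_hz=3000, hi_hz=8000):
    """Fire a wake-up if in-band energy exceeds the learned threshold."""
    return band_energy(signal, fs, lo_hz, hi_hz) > threshold

fs = 16000
t = np.arange(fs) / fs
quiet = 0.01 * np.random.randn(fs)                    # room-noise stand-in
crash = quiet + 0.5 * np.sin(2 * np.pi * 5000 * t)    # glass-like high-frequency burst
print(wake_event(quiet, fs, threshold=5.0), wake_event(crash, fs, threshold=5.0))
```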
There are lots of types of analog computing. For example, fluidic computers use tubes, valves, and reservoirs to direct fluids around a circuit, and depending on where the fluid winds up, you get different results. It's particularly useful for modeling the economy, as you can set the variables and see where the money winds up pooling.
Veritasium has a great couple of videos on this topic
Same here, I've seen those long before this. I'm still happy analog will come back.
Could you provide the titles of the videos or the links, please? I am curious to have a look at them. Thanks.
@@nicolasdujarrier The channel is Veritasium, the videos' titles are; "Future Computers Will Be Radically Different (Analog Computing)", and "The Most Powerful Computers You've Never Heard Of".
@@robbiekavanagh2802 Thanks for that. 💖🙌😺
@@robbiekavanagh2802 Thank you for the clarification 👍
As someone who has been programming computers for multiple decades, this video told me absolutely nothing about how I could use this technology.
Skeptical. Have to see it to believe it. Too much talk that reminds me of a startup trying to get more funding.
Nature's 'analog computer' is actually neither 'analog' nor a 'computer'. It is not 'digital', that's right. But it's definitely 'atomic'. And it cannot 'count'.
The question here is: why do we need such a non-computer?
Make Analog Great Again.
Making attorneys get attorneys
Maga hat
Wake word detection and very basic neural networks are a perfect application for this as those don't really need all that high precision, yet require thousands of parallel calculations for a quicker response.
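For illustration, a tiny dense layer evaluated in int8 - the sort of few-bits-of-precision arithmetic an analog array approximates - still lands very close to the full-precision result. The weights and inputs below are random placeholders, not a real wake-word model.

```python
import numpy as np

rng = np.random.default_rng(0)
w_fp = rng.normal(size=(16, 8)).astype(np.float32)    # 16 features -> 8 units
x_fp = rng.normal(size=16).astype(np.float32)

def quantize(a, scale):
    """Symmetric 8-bit quantization."""
    return np.clip(np.round(a / scale), -127, 127).astype(np.int8)

w_scale = np.abs(w_fp).max() / 127
x_scale = np.abs(x_fp).max() / 127
y_int = quantize(x_fp, x_scale).astype(np.int32) @ quantize(w_fp, w_scale).astype(np.int32)
y_approx = y_int * (w_scale * x_scale)                # dequantize the accumulator
y_exact = x_fp @ w_fp

print(np.max(np.abs(y_approx - y_exact)))             # small error despite 8-bit math
```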
I do feel like in the rush to create a digital future, a lot of the strengths of mechanical systems or analog systems were forgotten. There are a lot of things out there that just really do not need to be digital... a good case in point is a vending machine or a soda fountain at a fast food place. In the past, we had these simple, mechanical vending machines where you put in money or scanned a card, and then pressed a button corresponding to the drink you wanted, or you held down a button to get what you wanted from the fountain, with maybe an LED indicating when a drink had run out with a simple analog sensor. These were replaced with devices that have incredibly laggy touchscreens and menus that you have to hunt through, and all this trouble of greying out disabled items if the drink has run out. It seems laggier and more energy intensive than the old system. The problem with digital and IoT seems to be, in a nutshell, that because we CAN put a digital computer in everything, that means we SHOULD put a digital computer in everything... and in all honesty, there were plenty of mechanical or analog systems that worked just as well or better than the digital systems that replaced them. There are things digital does better, sure, but do we really need a digital computer in everything? Does every device need a touchscreen instead of physical buttons just because a smartphone has one and it seems modern?
Scanned a card? With what? You said "analog only"
I think you're complaining about the wrong thing. Digital is *absolutely* better than trying to make a collection of cogs and levers and such that dispenses the thing.
You can have buttons as inputs to digital things too -- see keyboards -- it's just a matter of the design people make for the vending machine. "The touchscreen is crappy" isn't a problem with digital; it's a problem with deciding to use a touch screen for no reason.
@@pfeilspitze By the way, you can build digital exactly with cogs and levers :) Digital does not mean "modern", and the quantum computers are more analog than they are digital.
@@getsideways7257 Sure, or with siphons, or any number of ways. And we don't, because doing it with transistors really is better.
Mechanical digital calculators exist. They're incredibly finickity, limited, and expensive compared to grade-school-level throwaway calculators using microchips.
@@pfeilspitze Well, it's just a matter of perspective. If the civilization crumbles and we are left with crude tools only (or not so crude), it's not exactly easy to get back to the stage where grade school level calculators on microchips are "throwaways"...
Noise is unfortunately present in all types of chips, but in analog chips the speed can be 2-3 orders of magnitude faster. Memory is also combined within the chip, so the footprint is smaller, and it's bus-less computation, so you don't have to read and write from a separate location. For AI tasks it could be very energy efficient.
There is a potential security issue at 8:32, where the analog processor could not distinguish a recording of glass breaking from an actual glass-breaking event. It may generate false alarms, and that is going to be costly. Perhaps it could be improved to work around this.
A typical shittalker here.
Every sound detection can be spoofed by playing the sound on a loudspeaker. Every single one. You may have some fidgety heuristics you can use to detect "fake" glass-breakage sounds, but they can be gamed as well, and you will get more false negatives.
The sound waves that are generated when you break glass are nearly identical to the ones that are generated when you play that sound on the loud speaker.
Also what is the use case? I want to trick someones alarm system by playing a loud sound of glass breaking? How loud does it have to be to go through the glass that I want to break?
I hear a lot of buzzwords. We'll see how much truth is behind this.
I'm halfway into the video... and there is nothing substantial.
No hard examples of what makes this fundamentally better than A2D conversion.
How long has quantum computing been sold now, 24 years - and yet there still isn't even one real application.
I think this is the same - too much hype.
I understand this type of video is supposed to dumb down the topic and use buzzwords. However, at the end of the day this is a repackaged DSP chip of the kind we have already been using since the 1980s.
So it's basically an efficient comparator module that I can program to detect any input?
Thanks!
Not convinced: for the past decade most of the analog circuitry in the vast majority of consumer gadgets has been replaced by eight pin microcontrollers.
I agree, and this video is not very transparent about the details. Additionally, the nature of analogue components make them much more vulnerable to external interference like temperature changes and motion. Without a digital control step between sensors, noise and errors propagate naively through all components in the chain of calculation. This means you will likely go through debugging hell during development and maintenance hell once something is out in production.
Analogue computation is suited to equations with a high number of dynamic parameters whose behavior and interaction can be replicated with analogue counterparts. A historical example is the prediction of tides with the positions of celestial bodies as input parameters. Another is aiming free-falling bombs from a plane with inputs like the airplane's speed, altitude, and the rotation of the earth. This raises the question of whether what this video attempts to sell is aimed at DIY consumers or enterprise-level investors.
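To make the differential-equation point concrete: the classic analog-computer setup for a damped mass-spring system, x'' = -(k/m)x - (c/m)x', feeds the acceleration through one integrator to get velocity and a second to get position. The Python below is only a digital stand-in for that wiring (a simple Euler loop); the parameters are arbitrary.

```python
def simulate(m=1.0, k=4.0, c=0.2, x0=1.0, v0=0.0, dt=1e-3, steps=5000):
    """Euler integration of x'' = -(k/m)*x - (c/m)*x', mimicking two chained integrators."""
    x, v = x0, v0
    for _ in range(steps):
        a = -(k / m) * x - (c / m) * v   # the summing junction
        v += a * dt                      # first integrator: acceleration -> velocity
        x += v * dt                      # second integrator: velocity -> position
    return x, v

print(simulate())   # position and velocity after 5 s of simulated "machine time"
```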
This is *not* analog computing; if it is, then it wasn't properly explained. Either way, true analog computing is Vaporware and will remain so. I'm old enough to have witnessed so many "disruptive inventions" that never materialized. And yes, prior to being a software developer, I had a degree in electronics. Analog at that.
Analog computing was quite a big thing in the '60s and early '70s - both analog and digital computers were manufactured, and each had fields of use where one or the other was better. At the time it was common to think it would remain so in the future, i.e. that both types of computer would keep being developed in parallel.
But even in these early days of computing it was well understood that analog computing works well for some specialized applications (where analog computers were blazingly fast, totally outclassing their digital counterparts), mostly signal processing of any kind, numerical solving of differential equations (eg. for the purpose of various mechanical or other physical simulations) etc. - but it is not possible to have a fully analog universal, all-purpose computer. In other words, analog computer does not fulfill the definition of a Turing machine, while digital computer does.
However, the main downside of the analog computers was that they needed to be programmed IN HARDWARE, ie. changing the program actually meant changing the layout of wires on the main switchboard and settings of knobs and switches that controlled operating parameters of specific modules (you may see these modules as equivalents of "functions" or "procedures" in digital software).
Once digital computers became powerful enough, they practically obliterated analog computers because of ease of programming - as it is also said in this video, digital computers can be programmed in software, which was not possible for analog ones. And so we started to "digitize" everything, which is probably not the best way. Return to analog processing in *some* aspects - those where that kind of processing works well - and having the specialized analog part cooperate with general-purpose digital part (what was once called a "hybrid computer") may be a desirable computer architecture of the future, but of course this needs to be done properly.
Great stuff. As an industrial electronics designer, I have been wondering when this would happen. Some years ago, when speech conversion was in its infancy and digital devices were cumbersome at it, I designed an analogue converter that would extract the initial phonetics and pass them to a micro to deal with the logic. I never built a prototype, but it would have used a HUGE amount less processing power.
My industrial designs are all a combination of digital and analogue. I have been involved in analogue electronics all my life so I'm very experienced in that. When I started programming micros a new world opened up some years ago. I see designers out there making software models of analogue that have big programs in DSP's when I would do it with op amps, logic chips and passive components. Much easier. Today, I design with whatever best suits the application. I end up with some interesting combinations of analogue and digital electronics.
I'm not sure how people aren't understanding what is being presented here. The comments are full of people saying "they didn't explain anything" and saying it's useless jargon lol. I get the benefits and understand this right away. It's essentially a way to have processes happen locally without everything having to be digital all the way through. When you add up the potential energy cost savings across the country or even the world, it's enormous. Lots of different devices could probably use analogue computing to complete a process with the help of a digital system somewhere along the line. Maybe credit card machines for example? Or gas pumps? Anything that does some kind of easy and repetitive task and uses a digital set up to do so. I record music and have accumulated a decent amount of knowledge with regards to translating analogue signals to digital ones. I would be curious to see how this could be used to decrease the latency of plugging analogue instruments into an audio interface to turn the analog signal into a digital one.
@@jedimindtrix2142 The people who don't understand it are the people who understand computing and electronics. Yeah, yeah, cost-efficient and all that, but how does it work? There is no explanation whatsoever of how it actually works at the electronic level. It seems there is no concept, just marketing.
@@jedimindtrix2142 You are the exact sort of gullible fool this video is aimed at. He 100% did not explain the technology at all. Half of your comment makes no sense. "It's essentially a way to have processes happen locally without everything having to be digital all the way through" - being a single local process neither prevents it from being digital nor saves any power. How on earth would a credit card machine or gas pump benefit from not being digital when all the data they deal with is digital anyway? That makes absolutely no sense. The more repetitive and basic the task, the easier it is to do digitally. Analogue computing is only used when you need a continuous waveform for accuracy (such as with music); any other time digital is both more efficient and cheaper. This is also not a new technology, as you imply - it is older than digital and came first, and analogue computers were made obsolete by digital in 99% of cases.
@robinconnelly6079 There was a product called the Girl Tech Password Journal in the 1990s that used this very technique. The password was a word or combination of words, which would be converted to phonemes and stored. The device, when prompted, would listen for audio matching the phoneme wave patterns and unlock when the match was close. It wasn't a perfect system, as the matching had a lot of error in it. I believe there was a switch to set the matching accuracy - basically some percentage of waveform mismatch allowed. It was a very clever device for its time.
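The actual Girl Tech implementation isn't documented here, but the general technique described is template matching with an adjustable tolerance: compare incoming audio against a stored waveform and accept when the normalized correlation is high enough. A small illustrative sketch, with synthetic signals standing in for real phoneme recordings:

```python
import numpy as np

def normalized_xcorr_peak(template, candidate):
    """Peak normalized cross-correlation between a stored template and a candidate."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    c = (candidate - candidate.mean()) / (candidate.std() + 1e-12)
    return (np.correlate(c, t, mode="valid") / len(t)).max()

def matches(template, candidate, tolerance=0.3):
    """`tolerance` plays the role of the device's accuracy switch."""
    return normalized_xcorr_peak(template, candidate) >= 1.0 - tolerance

rng = np.random.default_rng(1)
password = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 800))    # stored "phoneme" pattern
noisy_attempt = password + 0.2 * rng.normal(size=800)        # same word, spoken again
wrong_word = rng.normal(size=800)                            # unrelated audio
print(matches(password, noisy_attempt), matches(password, wrong_word))
```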
The world already melds the analog and digital domain where it makes the most sense. Sensing the glass breaking is by no means a ground breaking function and easily implemented extremely cheaply in software. Making the front end of this device perform a couple of extra "analog" computing steps before being digitized and processed anyway seems unnecessarily trite.
Sorry, "analog computing" will NOT supplant the digital domain in any serious way, not just by 2040, but ever. These units will make a nice peripheral piece to outfit a Raspberry Pi or Arduino experimental or R&D lab and no doubt, some real applications could be practical. However, these guys were large on hype and VERY shallow on exactly HOW and WHY it would be an improvement over status quo. The angle toward power savings is a stretch and immediately makes my BS-o-meter activate. My TV remote already lasts 2-3 years on 2 AA batteries.
You integrated the ad very well. With honesty, respect, not in a cheap way. 👍
This looks like just IoT stuff, and IoT did not really take off like everyone expected. By now we should have had IoT sensors in everything in our homes to monitor the whole environment and check for failures. That never happened.
It would be best if there were at least one real example circuit shown.
All these "analog computing" devices operate in conjunction with a conventional digital processor. There is no "taking over."
About this video: I thought it was about some existing analog computing technology, but it's more a discussion of what could be if that technology existed.
Kind of disappointed...
This video is basically just a commercial for Aspinity.
I pay for Premium; if you run a commercial anyway, I leave.
The first time I saw an analog computer, it was an analog calculator. By changing where wires plugged into its inputs, it was able to accurately calculate and display certain graphs and such (I don't remember specifically what they were). From the way they are going, it appears they are trying to combine analog with digital, which seems like an amazing idea. I look forward to seeing how well it optimizes power consumption in future devices :D
Computing going non-binary? Is no place safe from wokeness?
The lumber jack grabbed his coffee, axe and-a-log.
Imagine creating something like the first ARM processor, which consumed so little energy that static in the air was enough to make it run, but with this technology 🤯
Sooooo, we're still digitally processing analog signals but with fewer layers of chips and lower power consumption (fewer layers between a sensor and a logical output)? Seems like a tweak on the existing model, not an entirely different architecture. While not an engineer, I work with analog sensors in industrial controls (I am exactly the guy who needs to know when a bearing is going out before it actually does!), and this seems to fit under the IoT umbrella. I love the developments they are describing, but what is the "revolution"? Integration of traditional components into a single chip? Or the power management relative to maintaining signal processing?
(Are they talking about condensing microcontrollers with analog sensors over some common bus, but the bus is integrated into the microcontroller’s processor? “Chip Set” or “Chip”? Are we talking tiny PLCs in a common architecture? Or are we talking about platform for rapidly integrating analog sensors into custom hardware? All good, but I don’t know what they’re differentiating.)
Wow, you managed to make an entire video about it and explained so little. How is it programmed? Can it do logical operations, can it do sequential computations, can it store memory? All I take away is that it matches some patterns efficiently, and that's it.
There's a good reason we're not using analog computers nowadays, it's absolutely hell to make it reliable
As an audio engineer student I could honestly see limitless applications in my field.
But isn't the program used to interpret these analog signals digital too? And in the first place, digital conversion of analog signals is necessary because computer programs talk and listen using digital signals.
Right... that's what I don't understand too... and don't the computer applications run digitally? 😂 I don't understand.
This is a scam.
Video starts at 3:50
I am not convinced that this is taking over the world, but sure sounds like it would make that guy's day, eh!? The explanation doesn't even make sense. Analog processing AND software programmable. At some point, software is going to need a 16 bit integer (or whatever) to work with to make decisions - that's A2D conversion. So where's the big win? Color me skeptical... next!
"Audio devices might become mostly analog" - you mean, like they all used to be until just a few decades ago?
A walkman did not get more time out of a battery than an MP3 player without a display, btw.
Also, good luck building a mostly analog OP1 :D
I've been dreaming about something like this for years. I remember a world before everything went digital, most devices were electronic (analog.) Much more responsive, much more stable devices. We did have digital, we did have computers, but even those, as I remember, were more responsive.
It's an onboard Fast Fourier Transform on a single chipset, working independently of a central processor and tuned for a specific wave or multiple waves. I don't see the hoopla.
FFT is one of the application fields that is specifically very well suited for analog computing. Analog computers were always better at this than digital ones.
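For comparison, when you only care about one frequency ("tuned for a specific wave"), the cheap digital approach is the Goertzel algorithm rather than a full FFT - that is roughly what an analog tone-detecting front end has to beat on power. Standard textbook form below; the 1 kHz target and 8 kHz sample rate are arbitrary.

```python
import math

def goertzel_power(samples, fs, target_hz):
    """Squared magnitude of one DFT bin, computed with the Goertzel recurrence."""
    k = round(len(samples) * target_hz / fs)
    w = 2 * math.pi * k / len(samples)
    coeff = 2 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

fs, n = 8000, 800
tone = [math.sin(2 * math.pi * 1000 * i / fs) for i in range(n)]
silence = [0.0] * n
print(goertzel_power(tone, fs, 1000) > 1e3, goertzel_power(silence, fs, 1000) > 1e3)
```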
Love this channel, love the topics you all find to discuss. Keep up the good work!
Engineer here: there's a reason why analog computing stopped being a thing in the '50s. Problems with residual voltage offsets, mains/AC interference, noise, harmonic distortion, hysteresis, and slew rate only get worse as you try to condense analog circuitry (or run at higher frequencies). The imprecisions and limitations of digital calculation are fixed - a single formula gives you the exact amount of error to expect from roundoff or truncation, and you get the same bound every single time. If there's too much error, you can just use a larger, more accurate storage format (double-precision floating point, for example). Analog circuitry requires complex nonlinear models to accurately predict or replicate error, and you can't just allocate extra memory to fix it. If this guy somehow managed to invent a super-material that allows large-scale integration of analog components without these issues, I'd love to hear more about it.
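That "fixed and predictable error" point in a few runnable lines: IEEE 754 rounding error per operation is bounded by machine epsilon, and you buy more accuracy just by choosing a wider type - no shielding or temperature compensation required.

```python
import numpy as np

print(np.finfo(np.float32).eps)   # ~1.2e-07 relative rounding bound per operation
print(np.finfo(np.float64).eps)   # ~2.2e-16 for double precision

x = np.float32(1.0) / np.float32(3.0)
rel_err = abs(float(x) - 1.0 / 3.0) / (1.0 / 3.0)
print(rel_err <= np.finfo(np.float32).eps)   # the rounding stays inside the bound
```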
I sense Scam.
Is it April already?
Shill video. Analog computing will likely only make a comeback for applications that don't need accurate outputs.
The dude's also wrong about iphones "transmitting in analog". A carrier wave does not an analog signal make. It's the "digital is analog with extra steps" argument.
The less power-hungry, more sensitive, and smaller analog computers become, the far less accurate they are.
Anything that needs reliability needs to be digital, or digitized at the source.
Analog sensors waking digital ones has been used as a power efficiency stopgap before. They work best when the sensor is powered by the signal it's sensing, and in that sense, we're already doing a lot of that.
As for audio, you can take my DAC from my cold dead hands. I will always process audio in digital and convert to analog right before the speaker. Quality loss is a BITCH, and that's why even for "analog things", digital is still king.
Imagine using analog neural networks and watch them getting dumb over time
@@SirusStarTV We already have them, and they already do.
Thanks
Thank you! ❤️
Why would it save power? Raw analog signals don't have anywhere near the energy to power any half-way complex circuitry without amplification. And that amplification needs a power source, so there's no such thing as a "constantly off" analog sensor system, at least not one that's somehow superior to digital alternatives, where you can also gate the power source with a transistor and a latch on the sensor line....
Analog computing and Quantum computing, let's see which one is going to happen.
My guess is neither!
🎯 Key Takeaways for quick navigation:
00:00 🌐 Analog processors, unlike digital ones, operate based on waves, offering a potential paradigm shift in computing.
00:57 ⚡️ Analog processors promise AI and machine learning capabilities with significantly lower energy consumption, about 1/1000th of their digital counterparts.
03:54 🔄 Software-programmable analog processors can interpret raw analog signals without the need for conversion into digital, a transformative breakthrough in computing.
05:53 📈 The integration of analog computing in devices is expected to reach 30 billion by 2040, potentially revolutionizing how computing energy is consumed and deployed.
06:48 🔋 A combination of analog and digital processing could optimize energy usage, allowing for continuous monitoring with minimal power consumption and extended battery life.
09:19 💡 Analog systems enable more efficient monitoring and insights, particularly in industrial settings, by utilizing minimal power for extended periods and improving resource allocation.
10:47 💓 Analog computing excels in applications like heart monitoring, utilizing power-efficient sensors to detect anomalies and trigger appropriate actions, showcasing its potential in healthcare.
Made with HARPA AI
Analog computing was used a lot in the military. A great example is automated target tracking for tanks on the move: the idea was to keep your gun on target while riding over uneven terrain. I saw an example in Yugoslavian tanks (the T-84); it had ultrafast, efficient analog computing, much faster than the digital of that time. I believe the system was developed in the '70s and fielded in '84.
Analogue AI will be the birth of the singularity
Sadly this video contains so many incorrect statements that it just hurts. For example, the energy required for converting analog input data to digital is almost non-existent. Take your average game controller: it is connected via the digital USB connector, yet every modern gamepad has an "analog stick" whose position is read as two voltages (one for the x axis, one for the y axis). So does this require "computing" or drain your controller? Of course not - almost all analog-to-digital conversion is done by fixed circuitry working on purely electrical principles that have nothing to do with computing at all. They are effectively "voltage translators" that encode the analog signal digitally. There is no processing involved.
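As a hedged illustration of how little that "translation" involves: an ADC just maps a voltage onto a fixed-range integer code. The 3.3 V reference and 12-bit width below are typical-looking assumptions, not any particular controller's specs.

```python
def adc_code(voltage, vref=3.3, bits=12):
    """Map an input voltage onto an n-bit ADC code, clamped to the converter's range."""
    code = round(voltage / vref * (2**bits - 1))
    return max(0, min(2**bits - 1, code))

for v in (0.0, 1.65, 3.3):                 # stick at one end, centered, at the other end
    print(f"{v:.2f} V -> code {adc_code(v)}")
```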
Also this video claims most analog sensor data (at an example of the smartphone) were converted from analog to digital and then back for presentation to the user. This also is wrong, because almost all sensor data is never converted back, but instead used as input for some graphical user interface. Therefore you have to make the data digital at one point, because this is the only language LCDs and other screens can understand. Why? Because a pixel can either be on or off, and the light value has a discrete fixed range. There are no "analog screens". Thus you will always display your sensor data in a digital representation using an application on a given screen.
Another wrong claim is that using this technology (analog computers) we could finally reduce the energy consumption of specific tasks (e.g. voice command recognition) by doing the processing in analog. Sorry, but I have to break it to you - electrical engineers are a very smart bunch, so while we software developers mostly have no idea what they are talking about, they do understand our needs, and that is why this kind of command recognition is ALREADY analog, with specific on-board circuitry. Using an analog computer instead of an analog circuit will in fact INCREASE the energy required for the task. At best an analog computer could be used like an FPGA (a programmable circuit), but an FPGA will still require less energy - it just has to be "programmed" first.
It's too frustrating to even go into all the incorrect statements made here. But long story short, there is a reason why no one cares for analog computers - and no one ever will. It is a niche product for a niche use case. But it can never replace digital computing in the broad sense.