In 1980, I became a hardware engineer at Boeing Aerospace, the military side of Boeing. We were using simple TTL circuits and wire wrap boards. The easiest way to "program" these circuits was to remove wire-wrapped wires and re-wire the whole board. This was considered state of the art. Then I read in EE Times that Cypress made what they called a CPLD, bundled with an EDA tool called ABEL. I called them, and they were glad to send me samples plus ABEL. The products worked, and I started the CPLD craze at BAC. Then I moved on to a defense contractor in Orlando, FL. I introduced the CPLD there in 1984. I couldn't believe I was doing this, as they had hundreds of hardware engineers. Over the years, FPGAs came along and matured. They were hard to use, but EDA was the issue. They didn't trust it and continued using schematic entry. I was called to work on a brand new project, and we had meetings about how we were going to design. I was in favor of using VHDL, and a very influential engineer wanted schematic entry. At a meeting, our management wanted to know why each of us was so different in design approach. We were invited to come up there and justify our design entry methods. The other engineer made some slides and showed that schematic entry was the way to go, mainly because everyone knew how to use the tool. In his conclusion, he said, "A picture is worth a thousand words." Then it was my turn. I went up and explained the internal workings of an FPGA and how complicated they were as a function of density. I then showed how VHDL worked, especially using IF-THEN-ELSE and CASE statements. My final conclusion ended with "A word is worth a thousand pictures. Which one is easier to implement?" Of course we went with VHDL, and it became the standard design entry for decades. As these products matured, another EDA language came along, named Verilog. It had actually long been used by the ASIC community, which we knew nothing about.
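To give a taste of why "a word is worth a thousand pictures" landed, here is a minimal, hypothetical VHDL sketch (not from the original project; all names are illustrative) of a 4-to-1 multiplexer. One CASE statement replaces a schematic full of gates:

```vhdl
-- Hypothetical sketch: a 4-to-1 multiplexer. In a schematic this is a
-- tangle of AND/OR/NOT gates; in VHDL it is a handful of words.
library ieee;
use ieee.std_logic_1164.all;

entity mux4 is
  port (
    sel        : in  std_logic_vector(1 downto 0);
    a, b, c, d : in  std_logic;
    y          : out std_logic
  );
end entity;

architecture rtl of mux4 is
begin
  process (sel, a, b, c, d)
  begin
    case sel is
      when "00"   => y <= a;
      when "01"   => y <= b;
      when "10"   => y <= c;
      when others => y <= d;
    end case;
  end process;
end architecture;
```

The synthesis tool maps this onto the FPGA's lookup tables automatically, which is exactly the density argument: the denser the part, the more painful the equivalent schematic becomes.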
If you wanted to use Verilog, you had to quit and go with a commercial company. I was a contractor at this defense company and started roaming around other companies, spreading FPGA usage. Xilinx was always trying to get me to switch from Altera to them. One day a Xilinx FAE wanted to talk to me and asked what I would like in an FPGA that wasn't there yet. I told him that I had to implement my own FIFOs to cross clock boundaries. He immediately saw the benefit of this and went back to the factory. Lo and behold, the next line, the 4000 series, had built-in FIFOs! Since asynchronous clocks were a big problem, this family made it big everywhere. From that innovation, Xilinx made me an XPERTS partner, and I flourished big time. End of story. Sorry it was so long.
Any advice for someone who wants to learn more about advanced digital design on FPGAs? I learned about them in my BSc and did basic VHDL, but I would love to learn more advanced and useful stuff.
Wow, that is an amazing story Koyote, what an experience to see FPGAs start from nothing to where they are now!! I have done some stuff with FPGAs and I think they're great, but I was wondering, do you know of any hobby-friendly FPGA platforms? Like the Raspberry Pi equivalent of FPGAs? I do have experience with Verilog, but I just want something that is user friendly for tinkering with, just thought I'd ask :D
@@zb9458 I learned it during my studies as a photonics engineer, and we did it with a Xilinx board (I am not entirely sure anymore, but I think it was called Zen or something) and used Vivado to program it. You can also use Verilog, depending on what you prefer. It's a powerful tool once you figure out how to use it, and it has a ton of documentation. Maybe that is a starting point? 😬 Vivado is definitely not my favourite tool, but I must admit that it is very capable.
Another use of FPGAs that wasn't mentioned is that they can stand in for parts that have become unobtainium, like VIC-II chips. This means greatly enhanced capabilities for keeping existing systems in place far longer than the normal manufacturing support processes would otherwise allow.
If your system architecture isn't prepared for obsolescence, then you are doing it wrong. I do, by the way, know a great horseshoe place. They now offer steel, titanium, and 3D printed orthopedic horseshoes. ;-)
FPGAs are what roped me into computer engineering. At my university, we're expected to write a processor architecture from scratch (usually MIPS-based, but I went against the grain and did stack-based), write a "C subset" compiler for your architecture, a very simple operating system, and lastly some networking capabilities. It's crazy that microprocessor programming and FPGA development kits can be had at accessible prices at all.
"It's crazy that microprocessor programming and FPGA development kits can be had at accessible prices at all." -- Not exactly. High tech develops very fast. To catch up with the trends of high tech, and to surpass your competitors, you have to use the most advanced devices, which cost a lot. For example, the VCK5000 is about $13,000. Not every student can afford that, at least.
Back in the eighties, FPGA design tools were difficult and unpredictable. But thankfully with three decades of advancements, FPGA design tools are now difficult and unpredictable.
In 1995 I worked on FPGAs using Xilinx "Foundation" software. Its biggest shortcoming was that it used long interconnects between cells for some reason and did not calculate the logic-level transition times of these interconnects correctly. So I had to manually place drivers along the way in order to make the design fast enough to work properly. That was a pain in the neck. Today's software does not have this shortcoming anymore. But I am sure it has many new ones...
I used the first version of Altera software that was supposed to implement the "lock region," where a logic function was squeezed into an area of the array defined by the user: it didn't work as expected and almost killed my project in 2000... ☹️☹️☹️
Another cool thing about FPGAs is their interconnects. Decades ago, we didn't worry too much about how a signal got from A to B, because wire wrap and FPGAs implemented the connections for us. Then along came higher speed circuitry. Electromagnetic fields coupled signals in one wire into another wire, especially if those wires ran parallel to each other. We didn't worry about what was inside the FPGAs, because they seemed not to have this cross-coupling problem. Finally we decided to characterize this phenomenon for wire interconnects on the boards. We came up with something the RF engineers used: microstrips and striplines. Most of you will recognize them, because boards are constructed of layers. These layers are GND, then interconnects, then PWR, then interconnects, and so forth. This solved the glitch problem. CPLD and FPGA manufacturers figured this out much earlier, probably because they came from the ASIC world.
"electromagnetic field couple" that pretty much sounds like how a loop antenna can wirelessly couple with the ferrite antenna in a radio for AM reception
@@madmax2069 Actually, electromagnetic field, not just magnetic. All wires with current flowing through them have an electromagnetic field surrounding them, diminishing as a function of distance. This field will not induce a current in an adjacent wire unless it's changing. The faster the change, the more it will induce. That's why they weren't a problem in the beginning. They became a huge problem in the late 1990s, especially on circuit boards. Engineers would be looking for functional problems when the problem was glitches being induced into adjacent wires. The best way to solve those problems is to not have them at all.
There are crosstalk effects within the FPGA. They're just built into the timing models as margin (along with a good chunk of on-die power/gnd noise). If you've closed timing then the degradation isn't enough to break your design.
@@georgegu3374 Yes and no. Digital signals have rise and fall times that are analog. That's why flops have setup times. But then physicists tell us that everything is quantum. One thing I find strange is that light is both particle and wave.
Yes, amazing devices. During a project I tried to explain them, with my limited understanding, to a minimally technical person. I found a video or two to show him. His only response was "Roswell" (a reference to the supposed UFO crash).
To use a retro gaming analogy such as Analogue products or the MiSTer, an FPGA can be used like a purpose-built system on a chip that powers a clone console. But in contrast to a simple famiclone, it can be reconfigured to clone any one of a selection of other systems in the blink of an eye. And it can receive updates so that the systems it clones can be even more accurate as time goes on. (Even still, this just scratches the surface.)
Great video and very well researched, as always! Note that the name "Field Programmable Gate Array" was meant to evoke familiarity with the "Gate Array" (called that in the US; known as the ULA, Uncommitted Logic Array, in the UK), which was a chip that had the same logic for all clients except for the metal layer, which was different for each one. Most gate arrays used simple NAND gates as their basic element, and there was one FPGA that tried that, but it was really smart to have a larger basic block such as a four-input lookup table. And without custom metal wiring, FPGAs had to spend as many transistors routing signals as they did in the basic blocks themselves.
FPGAs are cool, especially for real-time inference with minimal latency. Azure provides FPGA accelerators for certain types of models. Even though I've studied FPGAs, I didn't know about the history. A video about Verilog would be cool.
Indeed! Piggybacking on your suggestion, a video about Verilog vs. VHDL and how these languages and synthesis tools helped close the design gap created by these chips and Moore's law would be cool... These days you almost never design with individual logic gates; a tool takes in a high-level description (Verilog/VHDL) and spits out hardware (a gate-level netlist), to the point of HLS (High Level Synthesis) allowing you to write C code and have the tool generate hardware accelerators for your functions.
I posted below, but I just want to talk more about how FPGAs evolved in the last 5 years - from a user perspective. I became a consultant after 5 years working for BAC. It was a hit or miss thing, and I could have been making a huge career mistake. But I didn't. I've talked below about how I started with the 22V10, CPLDs, then FPGAs maturing. In the past 10 years or so, FPGAs started incorporating software and hardware microprocessors. At first, the designs were done by hardware and software guys working together. I also had a software background, which is why I loved EDA and simulation. But a problem came up, because companies now wanted me to design the hardware AND software. I was well versed in C and C++, and I told them I could do it but would need more time. They wanted me to do it concurrently with the existing schedule. I told them I wasn't their guy. I retired five years ago, and I don't know what the design environment is today. Perhaps some of you would like to comment about what I say above. Here's a cool story. I was referred to a small company that made very precise oscillators. They needed someone to design a CPLD that controlled its frequencies. It was an easy design until I got there. I met with the engineering manager along with a young guy 6 months out of college. I told them how I would design it, and the engineering manager told me I had to use C Sharp (C#), because the kid had standardized everything to C#. Of course I thought this was ridiculous, as I would be using Vivado to do a very simple design. I told them I couldn't do it in C#, because Vivado didn't accept C# inputs. He asked me what it would take to make it happen, and I said about $1M for Xilinx to design the tool. He decided to go ahead with my method but wanted me to teach the kid how to use the tool. I said "Okay, but he has to be watching what I do so I can explain it to him." Long story short, he never observed me do the design.
When I was finished, everything worked great, and the EM asked me if the kid had learned what I did. I said "I can't teach someone if that someone isn't around to be my student." He got mad at me for not getting him, and I told him I wasn't the kid's manager. I got paid, the design was easy and worked very well. Six months later, he called me up and told me they had lost the design files. He wanted to know if I had them. I told him that if I did, I would be violating the NDA we both signed. I got paid a pretty shiny nickel to go back there and redesign it with my arms handcuffed behind my back. There is a moral to this story that I'm sure I don't have to explain.
@@koyotekola6916 I really enjoyed these stories. I'm graduating as an EE with an interest in FPGAs. I hope my future career can be as interesting as yours was.
@@lancecruwys2177 I hope so, too. There will be technological advances that you can get to first, then pioneer them through. Be on the lookout for them. I think your best bet is to become a hardware/software engineer. It can be done, contrary to what people say.
In the 70s we built a lot of amazing stuff with PALs; they removed a lot of TTL chips from our circuits. We even programmed a PAL to replace a part that a vendor would no longer sell to us, because they were trying to put us out of business.
Your video essays are consistently some of the highest quality videos on youtube right now and having a new one show up always brightens my day. Thank you!
I designed PLDs, CPLDs and FPGAs in the 1980s. Great to see a history of this exciting and challenging sector. Good work and again great research! Keep up the good work...
I work with FPGAs daily. They’re super super fun. I feel like the depth they’re capable of allows for some incredible creativity. You really really have to grind with them but it’s worth it
It always blows my mind how one tiny chip can become anything I want it to be. One moment it's a MicroBlaze system, another moment it's a dedicated neural network with specialized matrix multipliers. Like damn, that's plain magic.
Great video on one of my favorite subjects. I'd like to add a couple things. First of all (as the poster below said), this history skips a very important branch of IC history: the gate array, from which FPGAs take their name (Field Programmable Gate Array). Basically, gate arrays were ICs that consisted of a matrix of transistors (often termed gates) without the interconnect layers. Since transistors then, and largely even today, are patterned into the silicon wafer itself, this divided the wafer processing into two separate divisions: the wafer patterning, and the deposition of aluminum (interconnect). In short, a customer could save quite a bit of money by just paying for the extra masks needed to deposit interconnects, and take stock wafers to make an intermediate type of chip between full custom and discrete electronics. It was far less expensive than full custom, but of course that was like saying that Kathmandu is not as high as Everest. Xilinx used to have ads showing a huge bundle of bills with the caption "Does this remind you of gate array design? Perhaps if the bills were on fire." Altera came along and disrupted the PLA/PAL market and knocked over the king of them all, the 22V10, which could be said to be the 7400 of the PAL market. They owned the medium-scale programmable market for a few years until Xilinx came along. Eventually Altera fought back, but by then it was too late. However, Altera got the last word. The EDA software for both Xilinx and Altera began to resemble those "bills o' fire" from the original Xilinx ads, and Altera completely reversed its previous stance toward small developers (which could be described as "if you ain't big, go hump a pig") and started giving away their EDA software. Xilinx had no choice but to follow suit, and the market opened up with a bang.
There have been many alternate technologies to the RAM cell tech used by Xilinx, each with an idea towards permanently or semi-permanently programming the CLB cells so that an external loading PROM was not required. Some are still around, but what was being replaced by all that work and new tech was a serial EEPROM that was about 8 pins and approximately the cost of ant spit, so they never really knocked Xilinx off its tuffet. My favorite story about that was one maker here in the valley who was pushing "laser reprogrammability," where openings in the passivation of a sea-of-gates chip allowed a laser to burn interlinks and thus program the chip. It was literally PGA, dropping the F for field. It came with lots of fanfare, and left with virtual silence. I later met a guy who worked there and asked him, "What happened to the laser programmable IC tech?" He answered in one word: contamination. Vaporizing aluminum and throwing the result outwards is not healthy for a chip. After the first couple of revs of FPGA technology, the things started to get big enough that you could "float" (my term) major cells onto them, culminating with an actual (gasp) CPU. This changed everything. Now you could put most or all of the required circuitry on a single FPGA, and the CPU to run the thing as well. This meant that software hackers (like myself) could get into the FPGA game. The only difference now is that even a fairly large-scale 32-bit processor can be tucked into the corner of one. In the olden days, when you wanted to simulate hardware for an upcoming ASIC, you employed a server farm running 24/7 hardware simulations, or even a special hardware simulation accelerator. Then somebody figured out that you could lash a "sea of FPGAs" together and load a big ole giant netlist into it and get the equivalent of a hardware simulation, but near the final speed of the ASIC. DINI and friends were born: large FPGA array boards that cost a couple of automobiles to buy.
At this point Xilinx got wise to the game, I am sure. They were selling HUGE $1000 per chip FPGAs that could not have a real end consumer use.
AMD actually invented the 22V10 as a bipolar device, but had a lot of trouble making them. Cypress was the first to make a CMOS implementation of the 22V10. Atmel came along a little later. I know this because I worked on it first at AMD, then at Cypress. Many a 14 hour day of my life ....
@@johnhorner5711 When the Cypress 22V10 EE version came out, I built a programmer for it. Cypress was one of the few companies back then willing to disclose programming details. It didn't work! I could program it once, but then it stopped working. I spent a lot of time on the Cypress apps line trying to work it out, and never did (maybe it was you I talked to!). I loved that device; I built it into a lot of things. After that I did chip design at Zilog and didn't get back to programmable logic for 10 years, by which time it was dominated by Verilog, and then got back into programmables with Xilinx.
Takes me back to my engineering studies in '99, programming FPGAs in VHDL and seeing Xilinx come in to allow so much more functionality in sexier-looking packages than the DIPs.
Thank you for the trip down memory lane!! 😃 I remember working on designs with the XC2000 and later, the XC3000 family. I used XACT for design, and timing closure was sometimes a real challenge, with several iterations of the place & route overnight and finally manual tweaks at interconnect level to shave off the final ns or so. That low-level access and control over the hardware really appealed to me. Some larger logic blocks, e.g. for state machines, were created with PALASM 4 (a language very few people would remember now), which could be imported into XACT through an intermediate format, but the main FPGA circuit was schematic based and you could pull 74-series logic gates and other functions from a library. The catch was that some of these functions had subtle differences compared to the 74 series, catching out the unaware. I could go on and on with my memories! Since AMD used to make PALs, AMD acquiring Xilinx meant that things came full circle, in a way.
9:24 Bill Carter, the designer of the first FPGA (and later a CTO of Xilinx), has said that at first he thought FPGAs were _"the stupidest thing ever"_ -- because they were very large chips, and therefore slow and expensive. Who would choose the slower and more expensive solution over other alternatives? Although now we know how useful they are, back then FPGAs were not an obvious winner as a product. Rather, subtle factors, like being able to use a cutting-edge fabrication process before it was available to anybody else (15:46), really helped here. It took some genius to bet on this when starting the company in the 1980s.
The economics of FPGAs are interesting. As each process node gets more expensive, the economics of a particular design skew towards the FPGA as a choice.
Thank you for the interesting video on an important topic rarely considered outside the world of electronic engineers. I worked at AMD and then Cypress Semiconductor in the late 1980s as a product engineer for programmable logic devices. Then, and now, the use to make circuit emulators to prove out designs was but one (relatively small) market segment. Their first market was replacing TTL devices, which were then being used to make... everything! Data General famously created the first 16-bit minicomputer and beat DEC to market by using PALs as a primary logic design element (MMI's early yield problems put that project at risk!). Today the programmable radios used to allow updating cellphones after manufacture are an implementation of programmable logic. Many low-to-mid-volume applications do not justify the creation of full-custom chips and need something catalog products don't offer. This was and remains the key application for programmable logic devices, be they relatively simple PALs or complex devices like those Xilinx and Atmel innovated. Amazingly enough, the 22V10 macrocell PAL, brand new when I worked on it in 1984, is still being manufactured and sold for glue logic and other general/multi-purpose uses. I just checked, and Digikey has them in stock for $2.51 each :). You cover an astonishing array of diverse topics, and I'm amazed how much you get exactly right!
In the late 80s to early 90s, I used a lot of 16V8 and 22V10 PLDs. By the mid-90s, I was using AMD's MACH line of CPLDs, which they sold off to Lattice. Ironic that AMD bought Xilinx years later. Also by the mid-90s, FPGA complexity was large enough that you could program an 8-bit processor, like the Z80, into one. Vintage 70s 8-bitters live on, as macros, in FPGAs. That the circuit is stored in RAM can be a plus. It lets you reconfigure your "circuit" on the fly.
Dr. David Johanssen from Caltech created the first programming environment for FPGAs as his PhD project. I've known David for decades, and it's been an interesting journey he went on in the industry.
I did not expect such an in-depth electronics history dive from this channel! I'm not sure where CPLDs would fit in, but I was always taught that CPLDs were FPGAs' little brothers. FPGAs are exceptionally strong when multiple very fast serial streams need to be modified on the fly. FPGAs are more similar to GPUs in their pipelined operations than to a microprocessor, even though you can have "soft cores" programmed into the "logic slush".
CPLDs were and are FPGAs' little brothers. It's just that FPGA architecture lends itself to growing big and complex. There is still a market for CPLDs, though.
I love FPGAs. I do a lot of dev for science instrument companies that need special front-end hardware connected to microcontrollers or full microprocessors to collect the processed data from the instrument. The latest SoCs are just amazing in the scope and breadth of what they'll do. Knowing a single modern FPGA (and its dev environment) inside and out is literally a full-time job in itself. But they are a lot of fun to make things with.
I was just looking up a video on Verilog when I looked down and saw that you uploaded. I was not expecting you to have uploaded a video on FPGAs. Love your videos!
I remember Altera from my time at a semiconductor assembly company. Initially we had difficulty assembling their parts with our die attach and wire bond tools: their choice of leadframes gave us headaches, as we needed to reconfigure our tools mechanically!
Given the opacity of this market, I think that you did a phenomenal job with this video. Extremely high effort - thank you! I learned a lot. I'm a true "FPGA hobbyist" in the sense that I make fairly complex designs in my spare time, at 3-5k lines of VHDL code per design, to do all sorts of things. But I don't do it to "proper standards" with obsessive verification and analysis. That's too much of a pain if I'm not paid for it. I work with entry-level chips, so even with 99% chip utilization, synthesis times are only about a minute, and my approach is always just to synthesize the dang thing and examine the outputs on a logic analyzer in another window. Can't beat looking at physical outputs, and modern logic analyzer UI is faaar better than the timing analyzers within FPGA tools. I think that a giant barrier for FPGA usage in hobby settings is that there is no clear picture of how you'd use an FPGA for a "quick and dirty" design while avoiding common glitches and pitfalls. However, people who are formally trained in HDL set a bar that's too high for most hobbyists, so those hobbyists just go to microcontrollers instead, since "quick and dirty" is the name of the game with those. Yet FPGAs are far superior to microcontrollers for many hobbyist applications. You can't beat having so many I/Os, being able to nail deterministic timing, the scalability (I've "overclocked" so many displays and other chips by feeding them flawlessly timed signals, etc.), and even creating new functionality for various chips and devices by simply using the raw speed of FPGAs. Once you get comfortable with FPGAs, you start to find microcontrollers unbearably limited. My algorithms often start conceptually in mspaint (haha) and then I have a lot of fun implementing them on FPGAs. Being in the hobbyist space means that you don't need to obsess over resource utilization (as much).
Real designs are going to be some mixture of sequential and combinatorial logic, but when possible, I stick to keeping things combinatorial. You have the spare resources, so why not? Why wait 50 clock cycles for a result when you can do it in 0? ;) A lot of algorithms you read about "in the wild" are meant for CPUs, and as such, they're sequential. So you often end up in uncharted (or at least poorly documented) waters. To make a big but obvious point, I think that current FPGA software environments are complete dinosaurs, and they directly prevent adoption by non-professional users. It's very unfortunate. That said, the software side of FPGA synthesis is absurdly complex. The chips are nothing compared to the complexity of that software. I think that, for a given market to re-invent itself, you have to find a way to make it accessible. So far, the FPGA industry as a whole has failed at that, and the average electronics enthusiast considers FPGAs too complex or out of reach for their designs. I'll make some videos about it on an electronics channel I have soon enough. Should be fun.
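To make the combinational-versus-pipelined trade-off concrete, here is a hypothetical VHDL sketch (entity and signal names are mine, not from any commenter's design): the same multiply-add written once as pure combinational logic and once with register stages. The combinational version delivers its result in the same cycle through one long logic path; the pipelined version takes two clock cycles of latency but breaks the path up, which is what lets a design close timing at higher clock rates.

```vhdl
-- Hypothetical sketch: combinational vs. pipelined multiply-add.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity mac_demo is
  port (
    clk     : in  std_logic;
    a, b, c : in  unsigned(7 downto 0);
    y_comb  : out unsigned(16 downto 0);  -- same-cycle result
    y_pipe  : out unsigned(16 downto 0)   -- result two clocks later
  );
end entity;

architecture rtl of mac_demo is
  signal prod : unsigned(15 downto 0);
  signal c_q  : unsigned(7 downto 0);
begin
  -- Combinational: a*b + c ripples through with no clock at all,
  -- but the multiplier-plus-adder path limits the achievable clock rate.
  y_comb <= resize(a * b, 17) + resize(c, 17);

  -- Pipelined: register the product (and delay c to match), then add.
  -- Each stage's logic path is shorter, so the clock can run faster.
  process (clk)
  begin
    if rising_edge(clk) then
      prod   <= a * b;
      c_q    <= c;
      y_pipe <= resize(prod, 17) + resize(c_q, 17);
    end if;
  end process;
end architecture;
```

This is the trade the replies below debate: zero-latency results versus the shorter per-stage paths that make 100MHz+ clock frequencies achievable.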
I have never met a non-professional user who had any real need for a complex FPGA design. I do agree that it's a nice hobby, but at 5k lines of code it's still a hobby. Enjoy it and be glad that you don't have to design stuff with a quarter million lines of code, like the big boys. That's no fun. Why pipelining? So that you can get more than 5MHz clock frequency out of these things. If you don't need that, that's fine, but in most of my applications achieving 40-50MHz was an absolute must and getting closer to 100-125MHz was desirable. Asynchronous logic has its place, but it's usually a very small one.
@@schmetterling4477 I half-agree with your first statement. It reflects the current state of the FPGA market, but it's a market that's trying to evolve and become more accessible. Another point is that there are many small/medium-size companies out there where engineers end up wearing different hats. Lots of "non-professional" FPGA designers end up making designs that make their way into manufactured goods - whether they're properly verified or not. The scientific market is another example, where "makeshift" approaches are common, up to a point. Any FPGA design can be complex in the way of timing requirements, processing speed requirements, algorithm implementations, logic efficiency to fit limited form factors and power requirements, etc. High codebase complexity in itself doesn't supersede the others. I've made plenty of designs that worked perfectly at 150MHz+ with 95%+ logic utilization, even with lower-end chips. With the popularity of SoCs, RISC-V cores, etc., I definitely think that the average level of "hobbyist" designs is going up. You can get an Artix-7 dev board with 215k LEs for $100 or a Cyclone V dev board with 77k cells for $90. Software is free for both. Both can handle substantial designs. $40k enterprise chips are an entirely different animal, but lots of hobbyists these days have enterprise experience working for Google, Amazon, etc. They understand how things scale. That's why I think that we'll be seeing more hardcore hobbyists sooner rather than later.
This video was released on the same day I managed to take one "Altera Max EPM7128SLC84-15" to a microscope session at my school's biology laboratory. I decapped the die using an SMD rework station (fancy name for a big hair dryer). I will exhibit this and other ICs with the same microscopes at a technical fair at my school. Great and accidentally convenient video, as always!
Please note that you need either NOT and OR gates, or NOT and AND gates, to implement any combinational function (each pair is functionally complete). In the picture at 4:41 you can see the NOT gates close to the top on the left-hand side (the triangles with the circle). Using both AND and OR makes the chip more versatile and the implementation more compact.
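The functional-completeness point can be sanity-checked in a few lines of Python (purely an illustration, nothing from the chip itself): De Morgan's laws let you build AND out of only NOT and OR, and OR out of only NOT and AND, so either pair suffices for any combinational function.

```python
# De Morgan's laws, which make {NOT, OR} and {NOT, AND} each functionally
# complete:
#   a AND b == NOT(NOT a OR NOT b)
#   a OR  b == NOT(NOT a AND NOT b)

def and_from_not_or(a: bool, b: bool) -> bool:
    """AND built from only NOT and OR gates."""
    return not ((not a) or (not b))

def or_from_not_and(a: bool, b: bool) -> bool:
    """OR built from only NOT and AND gates."""
    return not ((not a) and (not b))

# Exhaustively verify both identities over all four input combinations.
for a in (False, True):
    for b in (False, True):
        assert and_from_not_or(a, b) == (a and b)
        assert or_from_not_and(a, b) == (a or b)
print("both gate pairs can implement any combinational function")
```

Since both pairs work, providing AND and OR planes (plus inverters) is a convenience for density, not a logical necessity, which is the point above.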
Great video! It feels like a microcontroller video is somewhere in the making. One of my professors explained to us in his lecture course the architecture of the PSoC. He described the ingenious invention by Cypress Semiconductor where they used CMOS switches to emulate any analog component, switching capacitors at different frequencies so that they behave as resistances. I still remember writing the complex-resistance equation on the blackboard, but I was a bit sceptical at the time (that it was solely their invention), since he might've simply been connected with that company. He might've known that for sure (he used to fly to San Jose every year for related conferences), but I still couldn't find on the internet (I didn't try that hard, to be fair) whether it's just marketing or PSoCs really are completely different from other microcontrollers. It's funny to remember learning PLAs and PALs when FPGAs had already been around for a long time. All those conjunctive normal forms, disjunctive normal forms, and converting it all in the end into a Zhegalkin polynomial... We had this specific type of torture in having to write all of it down by hand, which came to around 50 A4 pages with all the schematics (also drawn carefully by hand), including all the junctions between the AND and OR arrays.
Seriously: THANK YOU ! I really like your channel, but this video sent me back to the early 90s when I programmed my first GAL units while my teachers were teaching us TTL/CMOS... they were already late!
FPGAs are still mind blowing to me - back in 2001, my final year uni project was overseen by a supervisor who casually asked me to conduct a feasibility study where an Atmel FPGA would be used for a specific application which incorporated USB into the FPGA - I bought a book on VHDL and didn't get any further since USB was still pretty new, not to mention trying to get my head around something completely new like VHDL - a few years later I had a lodger who worked for Imagination Technologies and could knock up mobile graphics chip designs using VHDL over breakfast 🤣
Despite them being a really cool idea, in 30+ years I have only ever used programmable logic twice. The first was in the 1990s where I had to replace an obsolete stopwatch IC (maybe a 7225?) with a plugin board. I was just a baby engineer in a company where only seniors got to program microprocessors. Then in the 2010s I concocted some far fetched SPI bus master driving an ADC at full pelt with a micro as SPI slave with DMA storage. It was the only way to get the performance the system required. And it needed EVERY BIT of small print in the ADC datasheet. Really, a different micro would have been better but the development kit for the CPLD was cheaper than a new compiler for the micro. So I did it the hard way.
I worked for Scantron (eg - test scoring machines) when their logic boards, which consisted entirely of discrete gates, evolved into microprocessor machines. I haven’t worked for them in a while but I can see how their boards can evolve back (if they haven’t already) into gates (FPGAs).
Cool! I always wondered how those machines worked. Unfortunately I never got to see them in use in school. Weirdly enough most of the time my teachers would grade the slips by hand. Wonder if the license expired but they still had boxes of slips lol.
Thank you ever so much; I think I finally have these down well enough to start projects. But it was hard as hell & pretty damn lonely figuring them out almost on my own. PS: Thanks for the big picture. It helps me put the details together. For small/simple/parallel processing, FPGAs can't be beat.
Former non-technical Xilinx employee here. Every day I walked into the San Jose office, I would get this numinous feeling that we were an important part of human history. As I learned more about the products, it became clearer. I worked there in 2019-2020, and there were still plenty of employees from the late 80s. One of the VPs we supported started in 1990, and there were still at least 2 of the first 100 employees. People who made it to Xilinx never wanted to leave.
Yet another excellent video. Brought back so many memories for me watching this technology develop. Also reminded me of the team of colleagues I knew who programmed these devices.
The OLMC (output logic macrocell) usually has configuration bits: one for an inverted output ("XOR", per OLMC on most models), one for whether or not to use the flip-flop ("SYN" bit, usually global), and one for whether or not the pin is to be used as an input. The GAL16V8 is a really nice chip if all you need is a handful of logic gates for glue logic, or more advanced things like full 7-segment display decoders.
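The macrocell behavior described above can be sketched as a toy model. This is a hypothetical simplification of a GAL-style OLMC in Python, not the real fuse map; the class and field names are illustrative:

```python
# Toy model of a GAL16V8-style output logic macrocell (OLMC).
# Hypothetical simplification: product terms AND selected input
# polarities, the OR plane sums them, then config bits pick
# inversion ("XOR") and registered vs. combinational output ("SYN").

class OLMC:
    def __init__(self, product_terms, xor_invert=False, registered=False):
        # product_terms: list of terms; each term is a list of
        # (input_name, required_value) pairs, ANDed together.
        self.product_terms = product_terms
        self.xor_invert = xor_invert  # "XOR" bit: invert the OR-plane output
        self.registered = registered  # "SYN"-like bit: route through the flip-flop
        self.q = 0                    # flip-flop state

    def _sum_of_products(self, inputs):
        return int(any(all(inputs[name] == val for name, val in term)
                       for term in self.product_terms))

    def clock(self, inputs):
        """Apply inputs; registered outputs update on this 'clock edge'."""
        d = self._sum_of_products(inputs) ^ int(self.xor_invert)
        if self.registered:
            self.q = d
            return self.q
        return d

# XOR of a, b built as two product terms: (a AND NOT b) OR (NOT a AND b)
xor_cell = OLMC([[("a", 1), ("b", 0)], [("a", 0), ("b", 1)]])
print(xor_cell.clock({"a": 1, "b": 0}))  # 1
print(xor_cell.clock({"a": 1, "b": 1}))  # 0
```

Flipping `xor_invert` turns the same fuse pattern into an XNOR, which is exactly the kind of trick the inversion bit was for.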
FPGAs are also at the center of the top performing emulation setups for old video games and computers, for those who are looking for the most accurate representation of actual hardware. There are projects ongoing right now to make an FPGA version of the venerable 6502 that runs at 100 MHz, which is at least 25 times faster than previous commercially available 6502s. And along those same lines, another ongoing project is the creation of a Motorola 68030 running at 1 GHz.
I really appreciate the work you are doing, it gives viewers a valuable historical analysis of technology that explains how we got to where we are today, and it also provides insight into future possibilities yet to be realized. Thank you!
"CMOS was not available in the United States at the time and the company had to go to Ricoh in Japan for it." False. CMOS was invented and commercialized in the US. Ricoh was just the only company making EPROMs from CMOS that was willing to work with Altera.
Within the chip design industry, it's very common to utilize FPGAs as a physical emulation platform for verification of designs prior to tapeout. There are some obvious benefits, including being able to emulate designs on a platform which more closely aligns with the end product (instead of simulating through software), but one key advantage is the speed of emulation. Simulation through software is inherently extremely slow, with runtime increasing linearly or worse with the number of transistors. However, running a design on an FPGA is near real time - with 10MHz or even faster clocks, designs can be exercised literally within the blink of an eye, compared to countless hours chugging away on a server, eating a node-locked license. It's important to note a few downsides of FPGAs though, mainly the lack of waveform output (at least from my knowledge) since you can't really trace a hundred thousand electrical signals at the same time - from my experience, it's mostly been used as a means of connecting peripherals or as a sanity check.
One thing you didn't mention is that FPGAs are used where the algorithms will change over time, so you can optimize the hardware to run the new algorithms faster - as in many transmission devices, which stay deployed for years as hardware while the software moves on.
The US military pretty much made them stick and still is a major driver in the products that Altera and Xilinx offer. FPGAs are perfect for .mil signal processing like radar, sonar, avionics, etc. They were also huge on Actel back in the day. Fun stuff.
I started with FPGAs in 2001, and was fortunate to land a job at a company that had an internal training program with university level instructors. The flip side is that I learned VHDL first, and then found Verilog distasteful, but that's another story. Back then, we did all of our HDL entry in text editors and ran them through hideously expensive tools for simulation (Modelsim) and synthesis (Synplify Pro). So expensive that we had only a few dozen licenses for a department of 100+ folks. People would camp on them, go to lunch, and find their network cables pulled. The real challenge back then was making timing. The chips were fantastic, but once you exceeded a certain percentage of their resources, the clock speeds started dropping dramatically. I was working on signal filtering systems that had to meet a minimum clock rate, and I would spend hours trying to tweak the design to get that last little bit of speed out. The synthesis tools took hours on the PCs of the day. However, as others pointed out, the real shift began when both Xilinx and Altera started giving away limited versions of the tools, and then actually sold reasonably priced FPGAs that would work with them. All of a sudden, you didn't have to be a gigantic megacorp to use the technology. In fact, the company I work for now almost invariably uses Cyclones (we're an Altera house) because most of the control applications we design don't need anything bigger or better - though we do occasionally bump up to Arria or Stratix for bigger jobs.
When studying electronics for a design and build project, our class was warned that using any PAL or RAM based implementation would result in a fail. A full set of 7400 series chips resulted, and a lot of understanding too.
15:53 : SiliconBlue was bought by Lattice and constitutes the core of their newer low-cost offerings, while Actel was bought by Microsemi, which was bought by Microchip... Altera by Intel and Xilinx by AMD... So Lattice is the last "original" FPGA house now.
"as ASICs get increasingly expensive to design and fab" we're really shooting ourselves in the foot with that. honestly at this point I'm surprised there isn't already a JLC equivalent for 1 micrometer process ASIC manufacturing. feels like it shouldn't be too hard with the advances in machining technology. CNCs can now often hit 5-10um tolerances, and an experienced machinist can hit sub-micron tolerances with grinding and other technologies. and looking at the technology in DLP projectors, I figure it might not be too difficult (comparatively) to make 1 micron stuff without actually having to use masks.
5:05 I actually worked with John on a programmable logic project recently. I didn't know how influential his past work was until several months after the project ended XD. He made very thorough test vectors, a real eye for detail.
Worked for Altera in the early 2000's, fascinating technology. So niche then, remarkable how mainstream the devices are now with things like MiSTER etc. Some of the biggest FPGA's I sold around then were $4K per single device. :/
The fuse burning has made a return a few times over the journey. The Nintendo Switch uses a similar process to "protect" itself from being downgraded/hacked.
Great Video! Slight correction: Not every combinational function can be implemented using only AND and OR gates, but with NOT, AND, and OR it can (functional completeness)! Which is why most had all three and not only the two.
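The correction above can be made concrete: anything composed only of AND and OR is a monotone function, so XOR (and NOT itself) can never be reached without an inverter. A tiny Python check, enumerating the closure of the two input projections under AND/OR (the variable names here are illustrative):

```python
# AND and OR alone can only express monotone functions, so XOR is out
# of reach without NOT. We enumerate every 2-input function reachable
# by composing AND/OR starting from the inputs a and b, and check.

from itertools import product

def truth_table(f):
    # Truth table as a tuple over inputs (0,0), (0,1), (1,0), (1,1).
    return tuple(f(a, b) for a, b in product((0, 1), repeat=2))

# Start from the projections a and b; close under AND and OR.
reachable = {truth_table(lambda a, b: a), truth_table(lambda a, b: b)}
changed = True
while changed:
    changed = False
    for x in list(reachable):
        for y in list(reachable):
            for combined in (tuple(p & q for p, q in zip(x, y)),   # AND
                             tuple(p | q for p, q in zip(x, y))):  # OR
                if combined not in reachable:
                    reachable.add(combined)
                    changed = True

xor = truth_table(lambda a, b: a ^ b)
not_a = truth_table(lambda a, b: 1 - a)
print(xor in reachable)    # False: XOR is not monotone
print(not_a in reachable)  # False: neither is NOT
print(len(reachable))      # 4: just a, b, a AND b, a OR b
```

Adding NOT to the starting set makes the closure cover all 16 two-input functions, which is the completeness the comment refers to.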
Before the FPGA were PAL (Programmable Array Logic) chips such as the AMD 22V10 - an AND-OR array of logic with flip-flops on the outputs, programmable by JEDEC-compatible programmers. FPGAs are super fast and still used in niche applications. I worked at MMI and Lattice Semiconductor Corporation on these chips.
FPGAs were not super fast. Vendors like Xilinx would quote very high operating frequencies, but all that was was a D-type flip-flop configured as a T-type and clocked as fast as possible. Such a trivial circuit did not use any of the interconnect. Start using the interconnect across the CLBs (if I recall the term correctly, 30 years later) and the performance dropped dramatically. ASICs were faster, and full custom technology faster still.
The problem with FPGAs from the big 2 is that they are so complicated, and the EDA has a long learning curve. There are not enough young engineers learning it. Young engineers aren't given the opportunity to pick it up, as projects don't give them time to learn. My observation is that most designers are in their 40s. Furthermore, Xilinx is more interested in promoting high-level languages and AI because it sells bigger and more expensive ICs, so newcomers start from there. So who is going to work on the challenging low-level stuff in the future?
I have observed the same: year after year, there are fewer engineers who work with HDL directly and know how the FPGA works. Furthermore, the current Xilinx approach has rendered designs dependent on Xilinx devices. This makes it impossible to port a design to other FPGA manufacturers, which has become especially important over the last year due to the semiconductor shortage.
I remember TTL: Transistor-Transistor Logic. We used it in our Logic Circuits Lab subject during college days. We used 7400-series ICs for our truth-table logic projects on a breadboard.
As someone who has working relations with chip makers but no real technical knowledge of them, I find your video really good at explaining the technology. It saddens me greatly that there is a reason why barely any engineers get to talk on stage about their products. I wish more engineers took PR lessons. Maybe then the bean counters wouldn't have to send Bob from marketing to take center stage.
Another fun fact is that today FPGA boards and FPGA-based machines (the Synopsys ZeBu server, for example) are used to emulate chips before their fabrication.
There's like 50 people who work on FPGAs, and I'm one of them! WOOOOOOO! Just, I hope you like looking at oscilloscopes or simulated oscilloscopes. But hey, it's the only programming challenge that has no stack, heap, or oversimplified abstracted libraries. The only values for a variable are 1, 0, High, Low, and **** (what you say when you get an X).
How the heck is cross-talk mitigated at such interconnect speeds now? Software scheduling of "transmissions" that are physically close to each other to mitigate signal degradation?
@@bluesteelbass Two things: there are way fewer interconnects than you think (maybe 10s-100s?) for any given unit (limited fan-out), and clock speeds are around 200MHz at the high end, so every signal has some nanoseconds to "settle down" before the next clock cycle. Making sure of that is actually an entire phase of design called timing. But the FPGA has no ALU, because *every* LUT/CLB is a mini dedicated ALU, so you can pipeline and parallelize to hell and back to make accelerator cards of any configuration. Then the low clock speed is less of an issue if you can push huge data throughput out of PCIe regardless.
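The "settle down before the next clock" idea above is what static timing analysis checks: the longest combinational path between registers must fit within one clock period minus the setup time. A toy sketch in Python - the netlist, delay numbers, and 200 MHz clock are all made up for illustration, not from any real tool:

```python
# Toy static timing check: find the critical (longest-delay) path
# through a combinational DAG and compare it against the clock period.
from functools import lru_cache

# node -> (delay_ns, fanin list). Flip-flop outputs are the sources.
# All names and delays here are hypothetical.
netlist = {
    "ff_a":  (0.0, []),
    "ff_b":  (0.0, []),
    "lut1":  (0.8, ["ff_a", "ff_b"]),
    "lut2":  (0.8, ["lut1"]),
    "route": (1.5, ["lut2"]),   # interconnect often dominates in an FPGA
}

@lru_cache(maxsize=None)
def arrival(node):
    """Latest time (ns) this node's output settles after a clock edge."""
    delay, fanin = netlist[node]
    return delay + max((arrival(f) for f in fanin), default=0.0)

clock_period_ns = 5.0   # a 200 MHz clock
setup_ns = 0.5          # setup time at the capturing flip-flop
critical = max(arrival(n) for n in netlist)
slack = clock_period_ns - setup_ns - critical
print(f"critical path {critical:.1f} ns, slack {slack:.1f} ns")
# -> critical path 3.1 ns, slack 1.4 ns (positive slack: timing met)
```

Negative slack is the "failed timing" that the comments above describe spending hours tweaking designs to fix.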
@@adissentingopinion848 Ah ha! Programmed "settle" times. I did not realize the clock speeds of the programmable logic were so low compared to modern CPUs. When doing timing optimization, does the software know the architecture of the chip? If a "circuit" is routed along a physically longer path than another, in which both tasks need to be performed in parallel, are there algorithms that insert "wait" times for pieces of info that physically travel faster than others? I would assume the timing phase of design would correct for issues like this? This was one big problem I had when looking to experiment with these myself. The programming software did not seem to be very in depth for things like utilizing the FPGA's physical layout to better route signals for timing, concurrency, integrity, and "code optimization" of the logic you are programming. It seems like Vivado is finally "free" for some of their older product line. This is interesting. If you have any advice/references on pre-built modules/packages, paid or free, that would be fantastic. I am not trying to re-invent the wheel, and people with more experience in this than me have already done great work. I need to find some more recent physical tech specs of these things. Curious as to how much black magic of high speed circuit design is implemented in the chips. What other creative ways solved problems of throughput and cross talk, I wonder. This reminds me of PCIe 4.0 specs and speeds, where the PCIe slot furthest away from the CPU most of the time would never reach the fully rated bus speed. This is due to the distance the signal has to travel across the motherboard, and the crosstalk from higher bus speeds compared to PCIe 3.0 specs.
@@bluesteelbass The first step is getting good fundamentals of RTL design. I suggest understanding digital logic (Booleans and Kmaps), and search up Greg Stitt. He posts his college courses for free, sublime stuff. His "My tutorial" is like an entire book, and his labs go from baby's first ALU to a whole custom MIPS computer with a custom instruction set! And that's just the first Digital Design course. Then to reconfig 1 and 2. At that point, if you had done all of this, I would hire you over literally any regular CE graduate.
@@bluesteelbass It isn't the clock speed that is amazing in a FPGA, it is the parallel structure. There are hundreds of multiply and accumulate blocks, allowing very fast filtering or video processing. The instructions are encoded in the structure, no need to fetch. Many, very fast, memory blocks are included. These now are dual clock to allow crossing clock domains, which is very challenging otherwise.
Good job! Now that you've designed the FPGA functionality, make sure that you simulate it. The bigger the FPGA design is, the more you have to use simulation. With simulation and design requirements written down before you start, you can literally throw the power switch on and have it work correctly the first time! I've done this many times.
@@blackbriarmead1966 Congratulations, and I mean it. Usually, simulating before declaring "it is done" has not been too many engineers' forte. I believe the rush to get something out the door causes many engineers to immediately focus on the functional design. It almost takes a visionary to see how the two work hand in hand, and how the two will unintuitively cut schedule. Even some managers don't understand this concept.
I am an RFIC designer and love your channel. Are you planning to make any videos on analog/high frequency design aspect of integrated circuits? GaN would be an interesting subject as well as beamformers for phased array antennas.
Wow. This brings back memories of 1986, when I had to give an internal talk about the benefits of FPGAs (Xilinx) over PALs while I was an intern at a telecom R&D institute here in the Netherlands. I recall that we had to program PAL chips with a programming language called ABEL. For me, with my programming background, it was very easy to program circuit logic using that system. Never heard of ABEL since then. Anyone?
7:40 DRAM has the same problem that PLAs had: the length of the wires stays the same as more and more devices connect to them. The solution is the same: integrate more blocks as you add more transistors. FPGAs were, in a sense, a regression to the PROM in that the core element is a tiny SRAM. But FPGAs do scale, because the block doesn't grow. As more metal layers were added, these could be used to maintain the reach of blocks.
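The "core element is a tiny SRAM" point above can be made concrete: a k-input LUT is just a 2^k-bit memory whose inputs form the address and whose stored bits are the truth table. A minimal Python sketch (function names are illustrative, not any vendor's API):

```python
# A k-input LUT modeled as a 2^k-entry table: "configuring" the FPGA
# means filling the table; evaluation is just a memory read.

def make_lut(f, k=4):
    """Precompute the 2^k truth-table bits for boolean function f."""
    table = [f(*(((i >> b) & 1) for b in range(k))) for i in range(2 ** k)]
    def lut(*inputs):
        addr = sum(bit << b for b, bit in enumerate(inputs))  # inputs = address
        return table[addr]
    return lut

# Configure the same "hardware" as two different circuits:
and4 = make_lut(lambda a, b, c, d: a & b & c & d)
parity = make_lut(lambda a, b, c, d: a ^ b ^ c ^ d)
print(and4(1, 1, 1, 1))    # 1
print(parity(1, 0, 1, 0))  # 0
```

Because evaluation cost is one table read regardless of how complicated the function is, the block really doesn't grow as the comment says - density scales by tiling more identical LUTs, not by enlarging each one.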
In 1980, I became a hardware engineer at Boeing Aerospace, the military side of Boeing. We were using simple TTL circuits and wire wrap boards. The easiest way to program these circuits was to remove wire-wrapped wires and re-wire the whole board. This was considered state of the art. Then I read in EE Times that Cypress made what they called a CPLD, with the ABEL design tool included. I called them, and they were glad to send me samples plus ABEL. The products worked, and I started the CPLD craze at BAC. Then I moved on to a defense contractor in Orlando, FL. I introduced the CPLD there in 1984. I couldn't believe I was doing this, as they had hundreds of hardware engineers.
Over the years, FPGAs came along and matured. They weren't hard to use; EDA was the issue. The engineers didn't trust it and continued using schematic entry. I was called to work on a brand new project, and we had meetings about how we were going to design. I was in favor of using VHDL, and a very influential engineer wanted schematic entry. At a meeting, our management wanted to know why each of us was so different in our design approach. We were invited to come up and rationalize our design entry method. The other engineer made some slides and showed that schematic entry was the way to go, mainly because everyone knew how to use the tool. In his conclusion, he said "A picture is worth a thousand words." Then it was my turn. I went up and explained the internal workings of an FPGA and how complicated they were as a function of density. I then showed how VHDL worked, especially using IF THEN ELSE and CASE statements. My conclusion ended with "A word is worth a thousand pictures. Which one is easier to implement?" Of course we went with VHDL, and it became the standard design entry for decades. As these products matured, another design language named Verilog came along. It had actually long been used by the ASIC community, which we knew nothing about. If you wanted to use Verilog, you had to quit and go with a commercial company.
I was a contractor at this defense company and started roaming around other companies, spreading FPGA usage. Xilinx was always trying to get me to switch from Altera to them. One day a Xilinx FAE wanted to talk to me and asked me what I would like in an FPGA that wasn't there yet. I told him that I had to implement my own FIFOs to cross clock boundaries. He immediately saw the benefit of this and went back to the factory. Lo and behold, the next line, the 4000 series, had built-in FIFOs! Since asynchronous clocks were a big problem, this family made it big everywhere. From that innovation, Xilinx made me an XPERTS partner, and I flourished big time. End of story. Sorry it was so long.
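As an aside on why those clock-crossing FIFOs are tricky: dual-clock FIFOs conventionally pass their read/write pointers between domains in Gray code, so a pointer sampled mid-transition is either the old or the new value, never garbage. A small Python sketch of that single-bit-change property (a sketch of the standard technique, not of any vendor's implementation):

```python
# Gray code: successive values differ in exactly one bit. When the other
# clock domain samples a pointer during a transition, at most one bit is
# changing, so the sampled value is always a valid recent pointer.

def to_gray(n):
    return n ^ (n >> 1)

def from_gray(g):
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Successive Gray codes differ in exactly one bit:
for i in range(15):
    diff = to_gray(i) ^ to_gray(i + 1)
    assert bin(diff).count("1") == 1

print([to_gray(i) for i in range(8)])  # [0, 1, 3, 2, 6, 7, 5, 4]
```

A plain binary counter, by contrast, can flip many bits at once (7 to 8 flips four), and sampling that mid-transition in another clock domain can yield a wildly wrong pointer.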
Any advice for someone who wants to learn more about advanced digital design on FPGAs? I learned about them in my BSc and did basic VHDL, but I would love to learn more advanced and useful stuff.
very interesting! thank you for sharing
My tears fall as I read your comment. What an amazing story.
Wow that is an amazing story Koyote, what an experience to see FPGAs start from nothing to where they are now!! I have done some stuff with FPGA's and I think they're great, but I was wondering do you know of any hobby friendly FPGA platforms? Like the Raspberry Pi equivalent of FPGAs? I do have experience with verilog, but I just want something that is user friendly for tinkering with, just thought I'd ask :D
@@zb9458 I learned it during my studies as a photonics engineer, and we did it with a Xilinx board (I am not entirely sure anymore, but I think it was called Zen or something) and used Vivado to program it. You can also use Verilog, depending on what you prefer. It's a powerful tool once you figure out how to use it, and it has a ton of documentation. Maybe that is a starting point? 😬 Vivado is definitely not my favourite tool, but I must admit that it is very capable.
Another use of FPGAs that wasn't mentioned is that they can stand in for parts that have become unobtainium, like VIC-II chips. This means greatly enhanced capabilities for keeping existing systems in place far longer than the normal manufacturing support processes would otherwise allow.
If your systems architecture isn't prepared for obsolescence, then you are doing it wrong. I do, by the way, know a great horseshoe place. They now offer steel, titanium and 3d printed orthopedic horseshoes. ;-)
Your videos are FPGAs - factual, precise with great analogies.
i see what you did there bud
... lel
Inaaaa, shouldn't you be resting?
Thanks for the fun, pertinent, great acronym.
@@diracflux lol
Ugh
FPGAs are what roped me into computer engineering. At my university, we're expected to write a processor architecture from scratch (usually MIPS-based, but I went against the grain and did stack-based), write a "C subset" compiler for your architecture, a very simple operating system, and lastly some networking capabilities.
It's crazy that microprocessor programming and FPGA development kits can be had at accessible prices at all.
What school or what country? That sounds awesome! My assignments suck.
@@Sqwaush UNIFESP, Brazil! Top 500/1000 so it's not too shabby :D
Aren't FPGA in the electronics engineering side of things?
Damn that's quite a task. Did you do it in groups or everyone for themselves?
"It's crazy that microprocessor programming and FPGA development kits can be had at accessible prices at all." -- Not exactly. High tech develops very fast. To catch up with the trends, and to surpass your competitors, you have to use the most advanced devices, which cost a lot. For example, the VCK5000 is about $13,000. Not every student can afford that.
Back in the eighties, FPGA design tools were difficult and unpredictable. But thankfully with three decades of advancements, FPGA design tools are now difficult and unpredictable.
In 1995 I worked on FPGAs using Xilinx "Foundation" software. Its biggest shortcoming was that it used long interconnects between cells for some reason and did not calculate the logic level transition times of these interconnects correctly. So I had to manually place drivers along the way in order to make the design fast enough to work properly. That was a pain in the neck. Today's software does not have this shortcoming anymore. But I am sure it has many new ones...
I used the first version of Altera software which was supposed to implement the "lock region", where a logic function was squeezed into an area of the array, defined by the user... : it didn't work as expected and almost killed my project in 2000...☹️☹️☹️
Another cool thing about FPGAs is their interconnects. Decades ago, we didn't worry too much about how a signal got from A to B, because wire wrap and FPGAs implemented the connections for us. Then came higher speed circuitry. Electromagnetic fields coupled signals from one wire to another, especially if those wires ran parallel to each other. We didn't worry about what was inside the FPGAs, because they seemed not to have this cross coupling problem. Finally we decided to characterize this phenomenon for the wire interconnects on our boards. We came up with something the RF engineers used - microstrips and striplines. Most of you will recognize them, because boards are constructed of layers. These layers are GND, then interconnects, then PWR, then interconnects, and so forth. This solved the glitch problem. CPLD and FPGA manufacturers figured this out much earlier, probably because they came from the ASIC world.
"electromagnetic field couple" that pretty much sounds like how a loop antenna can wirelessly couple with the ferrite antenna in a radio for AM reception
@@madmax2069 Actually, an electromagnetic field, not just magnetic. All wires with current flowing through them have an electromagnetic field surrounding them, diminishing as a function of distance. This field will not induce a current in an adjacent wire unless it's changing. The faster the change, the more it will induce. That's why they weren't a problem in the beginning. They became a huge problem in the latter 1990s, especially on circuit boards. Engineers would be looking for functional problems when the problem was glitches being induced into adjacent wires. The best way to solve those problems is to not have them at all.
There are crosstalk effects within the FPGA. They're just built into the timing models as margin (along with a good chunk of on-die power/gnd noise). If you've closed timing, then the degradation isn't enough to break your design.
the real world is in analog mode.
@@georgegu3374 Yes and no. Digital signals have rise and fall times that are analog. That's why flops have setup times. But then physicists tell us that everything is quantum. One thing I find strange is that light is both particle and wave.
You know that FPGAs are magic when they seem even more mystifying and baffling the more you learn about them
Gosh it was hard making sense of it haha
yes, amazing devices. During a project I tried to explain them, with my limited understanding, to a minimal tech person. Found a video or 2 to show him. His only response was Roswell(reference to the supposed UFO crash)
@@clytle374 🤣😅
Absolutely! I love history of ic chips and fpga. Still big fun of fpga.
To use a retro gaming analogy such as Analogue products or the MiSTer, an FPGA can be used like a purpose-built system on a chip that powers a clone console. But in contrast to a simple famiclone, it can be reconfigured to clone any one of a selection of other systems in the blink of an eye. And it can receive updates so that the systems it clones can be even more accurate as time goes on. (Even still, this just scratches the surface.)
Great video and very well researched, as always! Note that the name "Field Programmable Gate Array" was meant to invoke familiarity with "Gate Array" (called that in the US, known as ULA, Uncommitted Logic Array, in the UK) which was a chip that had the same logic for all clients except for the metal layer, which was different for each one. Most Gate Arrays used simple NAND gates as their basic element and there was one FPGA that tried that, but it was really smart to have a larger basic block such as a four input lookup table. And with fixed wiring the FPGAs had to add as many transistors to route signals as in the basic blocks.
FPGA are cool and especially for real-time inference with minimal latency. Azure provides FPGA accelerator for certain types of models. Even though I've studied FPGA, I didn't know about the history. A video about Verilog would be cool.
Indeed! Piggybacking on your suggestion, a video about Verilog vs. VHDL and how these languages and synthesis tools helped close the design gap created by these chips and Moore's law would be cool... These days you almost never design with individual logic gates; a tool takes in a high-level description (Verilog/VHDL) and spits out hardware (RTL), to the point of HLS (High Level Synthesis) allowing you to write C code and then have the tool generate hardware accelerators for your functions.
I posted below, but I just want to talk more about how FPGAs evolved in the last 5 years - from a user perspective. I became a consultant after 5 years working for BAC. It was a hit or miss thing, and I could have been making a huge career mistake. But I didn't. I've talked below about how I started with the 22V10, CPLDs, then FPGAs maturing.
In the past 10 years or so, FPGAs started incorporating software and hardware microprocessors. At first, the designs were done by hardware and software guys working together. I also had a software background, which is why I loved EDA and simulation. But a problem came up, because companies now wanted me to design the hardware AND software. I was well versed in C, C++ and I told them I could do it but would need more time. They wanted me to do it concurrently with the existing schedule. I told them I wasn't their guy. I retired five years ago, and I don't know what the design environment is today. Perhaps some of you would like to comment about what I say above.
Here's a cool story. l was referred to a small company that made very precise oscillators. They needed someone to design a CPLD that controlled its frequencies. It was an easy design until I got there. I met with the Engineering Manager along with a young guy 6 months out of college. I told them how I would design it, and the engineering manager told me I had to use C Sharp (C#), because the kid had standardized everything to C#. Of course I thought this was ridiculous as I would be using Vivado to do a very simple design. I told them I couldn't do it in C#, because Vivado didn't accept C# inputs. He asked me what it would take to make it happen, and I said about $1M for Xilinx to design the tool. He decided to go ahead with my method but wanted me to teach the kid how to use the tool. I said "Okay, but he has to be watching what I do so I can explain it to him." Long story short, he never observed me do the design. When I was finished, everything worked great, and the EM asked me if the kid had learned what I did. I said "I can't teach someone if that someone isn't around to be my student." He got mad at me for not getting him, and I told him I wasn't the kid's manager. I got paid, the design was easy and worked very well. Six months later, he called me up and told me they had lost the design files. He wanted to know if I had them. I told him that if I did, I would be violating the NDA we both signed. I got paid a pretty shiny nickel to go back there and redesign it with my arms handcuffed behind my back. There is a moral to this story that I'm sure I don't have to explain.
Your arms handcuffed behind your back? You mean you had to say to others how to do it and not do it yourself?
You were a pioneer in the field of what is now referred to as 'quiet quitting'.
@@haideralikhan5947 Not sure what you mean by that.
@@koyotekola6916 I really enjoyed these stories. I'm graduating as an EE with an interest in FPGAs. I hope my future career can be as interesting as yours was.
@@lancecruwys2177 I hope so, too. There will be technological advances that you can get to first, then pioneer them through. Be on the lookout for them. I think your best bet is to become a hardware/software engineer. It can be done, contrary to what people say.
In the 70's we built a lot of amazing stuff with PALs; they removed a lot of TTL chips from our circuits. We even used a programmed PAL to replace a part that a vendor would no longer sell to us, because they were trying to put us out of business.
Replacing TTL chips was indeed the first market for PALs.
Your video essays are consistently some of the highest quality videos on youtube right now and having a new one show up always brightens my day. Thank you!
I designed PLDs, CPLDs and FPGAs in the 1980s. Great to see a history of this exciting and challenging sector. Good work and again great research! Keep up the good work...
Wow dats so cool
Had so much fun learning embedded logic design, I mainly worked on basys 3 and xilinx spartan 3 FPGAs.
How very fascinating. I just read about them today in my textbook.
I work with FPGAs daily. They’re super super fun. I feel like the depth they’re capable of allows for some incredible creativity. You really really have to grind with them but it’s worth it
FPGAs always have fascinated me. So counter intuitive to print, plug, and play standardization of tech industry. Thanks for the video!
I am a FPGA lover and developer and I love to see other people getting introduced to these amazing and capable devices. Thanks
I used the 22V10 PAL quite a lot. It was a big upgrade from a breadboard full of TTL.
Me too. Power pigs though
I'm an associate FPGA engineer but never got into the history of it. This was a very eye opening video! I just discovered your channel. Great videos.
It always blows my mind how one tiny chip can become anything I want it to be. One moment it's a MicroBlaze system, another moment it's a dedicated neural network with specialised matrix multipliers. Like damn, that's plain magic
Yeah, it's bananas.
I love all your videos -- thanks for the excellent research and presentation which goes into all of them.
Great video on one of my favorite subjects. I'd like to add a couple things. First of all (as the poster below said), this history skips a very important branch of IC history: the gate array, from which FPGAs take their name (Field Programmable Gate Array). Basically gate arrays were ICs that consisted of a matrix of transistors (often termed gates) without the interconnect layers. Since transistors then, and largely even today, are patterned into the silicon wafer itself, this divided wafer processing into two separate steps: the wafer patterning, and the deposition of aluminum (interconnect). In short, a customer could save quite a bit of money by just paying for the extra masks needed to deposit interconnects, and take stock wafers to make an intermediate type of chip between full custom and discrete electronics. It was far less expensive than full custom, but of course that was like saying that Kathmandu is not as high as Everest. Xilinx used to have ads showing a huge bundle of bills with the caption "does this remind you of gate array design? Perhaps if the bills were on fire".
Altera came along and disrupted the PLA/PAL market and knocked over the king o' them all the 22V10, which could be said to be the 7400 of the PAL market. They owned the medium scale programmable market for a few years until Xilinx came along. Eventually Altera fought back, but by then it was too late. However, Altera got the last word. The EDA software for both Xilinx and Altera began to resemble those "bills o' fire" from the original Xilinx ads, and Altera completely reversed its previous stance to small developers (which could be described as "if you ain't big, go hump a pig") and started giving away their EDA software. Xilinx had no choice but to follow suit, and the market opened up with a bang.
There have been many alternate technologies to the RAM cell tech used by Xilinx, each with an idea towards permanently or semipermanently programming the CLB cells so that an external loading PROM was not required. Some are still around, but what was being replaced by all that work and new tech was a serial EEPROM that was about 8 pins and approximately the cost of ant spit, so they never really knocked Xilinx off its tuffet. My favorite story about that was one maker here in the valley who was pushing "laser reprogrammability", where openings in the passivation of a sea-of-gates chip allowed a laser to burn interlinks and thus program the chip. It was literally PGA, dropping the F for field. It came with lots of fanfare, and left with virtual silence. I later met a guy who worked there and asked him "what happened to the laser programmable IC tech?". He answered in one word: contamination. Vaporising aluminum and throwing the result outwards is not healthy for a chip.
After the first couple of revs of FPGA technology, the things started to get big enough that you could "float" (my term) major cells onto them, culminating with an actual (gasp) CPU. This changed everything. Now you could put most or all of the required circuitry on a single FPGA and the CPU to run the thing as well. This meant that software hackers (like myself) could get into the FPGA game. The only difference now is that even a fairly large scale 32 bit processor can be tucked into the corner of one.
In the olden days, when you wanted to simulate hardware for an upcoming ASIC, you employed a server farm running 24/7 hardware simulations, or even a special hardware simulation accelerator. Then somebody figured out that you could lash a "sea of FPGAs" together, load a big ol' giant netlist into it, and get the equivalent of a hardware simulation, but near the final speed of the ASIC. DINI and friends were born: large FPGA array boards that cost a couple of automobiles to buy. At this point Xilinx got wise to the game, I am sure. They were selling HUGE $1000-per-chip FPGAs that could not have had a real end-consumer use.
AMD actually invented the 22V10 as a bipolar device, but had a lot of trouble making them. Cypress was the first to make a CMOS implementation of the 22V10. Atmel came along a little later. I know this because I worked on it first at AMD, then at Cypress. Many a 14 hour day of my life ....
@@johnhorner5711 When the Cypress 22v10 EE version came out, I built a programmer for it. Cypress was one of the few companies back then willing to disclose programming details. It didn't work! I could program it once, but then it stopped working. Spent a lot of time on the Cypress apps line trying to work it out, never did (maybe it was you I talked to!). I loved that device, I built it into a lot of things. After that I did chip design at Zilog and didn't get back to programmable logic for 10 years, which by that time was dominated by Verilog, and then got back into programmables with Xilinx.
Love working with FPGAs. I like how you have total control at the clock-cycle level; in a CPU there are so many layers.
I've used FPGAs for years, but I didn't know the complete history, especially that part about PALs. Very well done.
Takes me back to my engineering studies in '99, programming FPGAs in VHDL and seeing Xilinx come in to allow so much more functionality in sexier-looking packages than the DIPs
Thank you for the trip down memory lane!! 😃
I remember working on designs with the XC2000 and later, the XC3000 family. I used XACT for design, and timing closure was sometimes a real challenge, with several iterations of the place & route overnight and finally manual tweaks at interconnect level to shave off the final ns or so. That low-level access and control over the hardware really appealed to me. Some larger logic blocks, e.g. for state machines, were created with PALASM 4 (a language very few people would remember now), which could be imported into XACT through an intermediate format, but the main FPGA circuit was schematic-based and you could pull 74-series logic gates and other functions from a library. The catch was that some of these functions had subtle differences, compared to the 74 series, catching out the unaware. I could go on and on with my memories!
Since AMD used to make PALs, AMD acquiring Xilinx meant that things came full circle, in a way.
Huge respect! We did similar work with custom chip design & layout to achieve higher efficiency with low power consumption.
9:24 Bill Carter, the designer of the first FPGA (and later a CTO of Xilinx), has said that at first he thought FPGAs were _"the stupidest thing ever"_ -- because they were very large chips, and therefore slow and expensive. Who would choose the slower and more expensive solution over other alternatives?
Although now we know how useful they are, back then FPGAs were not an obvious winner as a product. Rather, subtle factors, like being able to use a cutting-edge fabrication process before it was available to anybody else (15:46), really helped here. It took some genius to bet on this when starting the company in the 1980s.
The economics of FPGAs are interesting. As each process node gets more expensive, the economics of a particular design skew towards the FPGA as a choice.
Thank you for the interesting video on an important topic rarely considered outside the world of electronic engineers. I worked at AMD and then Cypress Semiconductor in the late 1980s as a product engineer for programmable logic devices. Then, and now, the use to make circuit emulators to prove out designs was but one (relatively small) market segment. Their first market was replacing TTL devices, which were then being used to make .... everything! Data General famously created the first 16-bit minicomputer and beat DEC to market by using PALs as a primary logic device element (MMI's early yield problems put that project at risk!). Today the programmable radios used to allow updating cellphones after manufacture are an implementation of programmable logic. Many low-to-mid-volume applications do not justify the creation of full-custom chips and need something catalog products don't offer. This was and remains the key application for programmable logic devices, be they relatively simple PALs or complex devices like Xilinx and Atmel innovated. Amazingly enough, the 22V10 macrocell PAL, brand new when I worked on it in 1984, is still being manufactured and sold for glue logic and other general/multi-purpose uses. I just checked, and Digikey has them in stock for $2.51 each :). You cover an astonishing array of diverse topics, and I'm amazed how much you get exactly right!
In the late 80s to early 90s, I used a lot of 16V8 and 22V10 PLDs. By the mid-90s, I was using AMD's MACH line of CPLDs, which they sold off to Lattice. Ironic that AMD bought Xilinx years later. Also by the mid-90s, FPGA complexity was large enough that you could program an 8-bit processor, like the Z80, into one. Vintage 70s 8-bitters live on, as macros, in FPGAs. That the circuit is stored in RAM can be a plus. It lets you reconfigure your "circuit" on the fly.
Dr David Johanssen from Cal Tech created the first programming environment for FPGAs as his PhD project. I’ve known David for decades and it’s interesting journey he went on in the industry.
I did not expect such an in-depth electronics history dive from this channel! I'm not sure where CPLDs would fit in, but I was always taught that CPLDs were FPGAs' little brothers. FPGAs are exceptionally strong when multiple very fast serial streams need to be modified on the fly. FPGAs are more similar to GPUs in their pipelined operations than to a microprocessor, even though you can have "soft cores" programmed into the "logic slush".
We need more non-Intel electronics history. Just trying to fill a need.
CPLDs were and are FPGA's little brother. It's just that FPGA architecture lends itself to growing big and complex. There is still a market for CPLDs, though.
"With great flexibility comes great misakeability." I love it! 🙂
It's always important to include a typoe when discussing mistakes :)
@@grizwoldphantasia5005 it's actualle typo, not typoe
@@n27272 it was intentional
@@MehdiTirguit whoooosh
I love FPGAs. I do a lot of dev for science instrument companies that need special front-end hardware connected to microcontrollers or full microprocessors to collect the processed data from the instrument. The latest SoCs are just amazing in scope and breadth of what they'll do. Knowing a single modern FPGA inside and out (and its dev environment) is literally a full-time job in itself.
But they are a lot of fun to make things with.
I was just looking up a video on Verilog when I looked down and saw that you uploaded. I was not expecting you to have uploaded a video on FPGAs. Love your videos
Quick, look up fluidics.
@@brodriguez11000 🤣
I remember Altera at a semiconductor assembly company. Initially we had difficulty assembling their parts with our die attach and wire bond tools: their choice of leadframes gave us headaches, as we needed to reconfigure our tools mechanically!
Given the opacity of this market, I think that you did a phenomenal job with this video. Extremely high effort - thank you! I learned a lot. I'm a true "FPGA hobbyist" in the sense that I make fairly complex designs in my spare time, at 3 - 5k lines of VHDL code per design, to do all sorts of things. But, I don't do it to "proper standards" with obsessive verification and analysis. That's too much of a pain if I'm not paid for it. I work with entry-level chips, so even with 99% chip utilization, synthesis times are only about a minute.. and my approach is always just to synthesize the dang thing and examine the outputs on a logic analyzer in another window. Can't beat looking at physical outputs, and modern logic analyzer UI is faaar better than the timing analyzers within FPGA tools. I think that a giant barrier for FPGA usage in hobby settings is that there is no clear picture of how you'd use an FPGA for a "quick and dirty" design while avoiding common glitches and pitfalls. However, people who are formally trained for HDL set a bar that's too high for most hobbyists, so those hobbyists just go to microcontrollers instead.. since "quick and dirty" is the name of the game with those. Yet, FPGAs are far superior to microcontrollers for many hobbyist applications. You can't beat having so many I/Os, being able to nail deterministic timing, scalability (I've "overclocked" so many displays and other chips by feeding them flawlessly-timed signals etc), and even creating new functionality for various chips and devices by simply using the raw speed of FPGAs. Once you get comfortable with FPGAs, you start to find microcontrollers unbearably limited. My algorithms often start conceptually in mspaint (haha) and then I have a lot of fun implementing them on FPGAs. Being in the hobbyist space means that you don't need to obsess over resource utilization (as much).
Real designs are going to be some mixture of sequential and combinatorial logic, but when possible, I stick to keeping things combinatorial. You have the spare resources so why not. Why wait 50 clock cycles for a result when you can do it in 0. ;) A lot of algorithms you read about "in the wild" are meant for CPUs, and as such, they're sequential. So, you often end up in uncharted (or at least poorly documented) waters.
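A toy Python sketch of that tradeoff (the three pipeline stages here are made up purely for illustration): the combinational version answers "immediately," while the pipelined version has a three-tick latency but could accept a new input on every tick, which is what buys the clock rate on real hardware.

```python
# Toy model of combinational vs pipelined computation of ((x + 1) * 2) - 3.

def combinational(x):
    # Whole computation settles within one "cycle".
    return ((x + 1) * 2) - 3

def make_pipeline():
    stages = [None, None, None]       # three pipeline registers
    def tick(x):
        # Shift data through the registers, one operation per stage.
        stages[2] = None if stages[1] is None else stages[1] - 3
        stages[1] = None if stages[0] is None else stages[0] * 2
        stages[0] = x + 1
        return stages[2]              # valid only after 3 ticks
    return tick

tick = make_pipeline()
results = [tick(x) for x in (5, 6, 7, 8, 9)]
print(results)   # → [None, None, 9, 11, 13]: two ticks of latency, then results stream out
```

The pipelined version is "slower" for a single answer but sustains one result per tick once full, which is why real designs pipeline when they need clock speed.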
To make a big but obvious point, I think that current FPGA software environments are complete dinosaurs, and they directly prevent adoption by non-professional users. It's very unfortunate. That said, the software side of FPGA synthesis is absurdly complex. The chips are nothing compared to the complexity of that software. I think that.. for a given market to re-invent itself, you have to find a way to make it accessible. So far, the FPGA industry as a whole has failed at that, and the average electronics enthusiast considers FPGAs too complex or out of reach for their designs. I'll make some videos about it on an electronics channel I have soon enough. Should be fun.
I have never met a non-professional user who had any real need for a complex FPGA design. I do agree that it's a nice hobby, but at 5k lines of code it's still a hobby. Enjoy it and be glad that you don't have to design stuff with a quarter million lines of code, like the big boys. That's no fun. Why pipelining? So that you can get more than 5MHz clock frequency out of these things. If you don't need that, that's fine, but in most of my applications achieving 40-50MHz was an absolute must and getting closer to 100-125MHz was desirable. Asynchronous logic has its place, but it's usually a very small one.
@@schmetterling4477 I half-agree with your first statement. It reflects the current state of the FPGA market, but it's a market that's trying to evolve and become more accessible. Another point is that there are many small/medium size companies out there where engineers end up wearing different hats. Lots of "non-professional" FPGA-designers end up making designs that make their way into manufactured goods - whether they're properly verified or not. Scientific market is another example, where "makeshift" approaches are common, up to a point. Any FPGA design can be complex in the way of timing requirements, processing speed requirements, algorithm implementations, logic efficiency to fit limited form factors and power requirements etc. High codebase complexity in itself doesn't supercede the others. I've made plenty of designs that worked perfectly at 150MHz+ with 95%+ logic utilization even with lower-end chips. With the popularity of SoCs, RISC-V cores etc, I definitely think that the average level of "hobbyist" designs is going up. You can get an Artix-7 dev board with 215k LE's for $100 or a Cyclone-5 dev board with 77k cells for $90. Software is free for both. Both can handle substantial designs. $40k enterprise chips are an entirely different animal, but lots of hobbyists these days have enterprise experience working for Google, Amazon, etc. They understand how things scale. That's why I think that we'll be seeing more hardcore hobbyists sooner than later.
This video was released on the same day I managed to take one "Altera Max EPM7128SLC84-15" to a microscope section at my school's biology laboratory. I decapped the die using an SMD rework station (fancy name for big hair dryer). I will exhibit this and other ICs with the same microscopes in a technical fair at my school.
Great and accidentally convenient video, as always!
Please note that you need either {NOT, OR} or {NOT, AND} gates to implement any combinational function; AND and OR alone are not enough. In the picture at 4:41 you can see the NOT gates near the top on the left-hand side (the triangles with the circle). Using both AND and OR makes the chip more versatile and the implementation more compact.
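A quick way to convince yourself of the {NOT, AND} claim: De Morgan's law lets you rebuild OR from NOT and AND alone. A toy Python check (the helper names are mine, just for illustration) covers all input combinations:

```python
# Functional completeness sketch: a OR b == NOT(NOT a AND NOT b),
# so {NOT, AND} can express OR -- and, by extension, any combinational
# function via sum-of-products. The same argument works for {NOT, OR}.

def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR_from_not_and(a, b):
    return NOT(AND(NOT(a), NOT(b)))

# Exhaustively compare against the built-in OR over the full truth table.
for a in (0, 1):
    for b in (0, 1):
        assert OR_from_not_and(a, b) == (a | b)
print("OR rebuilt from {NOT, AND} matches on all inputs")
```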
Great video! It feels like a microcontroller video is somewhere in the making. One of my professors, in his lecture course, explained to us the architecture of the PSoC. He described the ingenious invention by Cypress Semiconductor where they used CMOS to make any analog component by switching transistors at different frequencies, using their capacitive resistance. I still remember writing the complex resistance equation on the blackboard, but I was a bit sceptical at the time (that it was solely their invention) since he might've simply been connected with that company. He might've known that for sure (he used to fly to San Jose every year to visit related conferences), but I still couldn't find out on the internet (I didn't try that hard, to be fair) whether it's just marketing or PSoCs are actually completely different things from other microcontrollers.
It's funny remembering learning PLAs and PALs when FPGAs had already been around for a long time. All those conjunctive normal forms, disjunctive normal forms, and converting it all in the end into a Zhegalkin polynomial... We had this specific type of torture of having to write all this down by hand, which came to like 50 A4 pages with all the schematics (also drawn carefully by hand), including all the junctions between the AND and OR arrays.
Seriously: THANK YOU ! I really like your channel, but this video sent me back to the early 90s when I programmed my first GAL units while my teachers were teaching us TTL/CMOS... they were already late!
Excellent video! Well researched, organized and very understandable. I don’t work in the field, but I couldn’t turn it off. Kudos!
FPGAs are still mind blowing to me - back in 2001, my final year uni project was overseen by a supervisor who casually asked me to conduct a feasibility study where an Atmel FPGA would be used for a specific application which incorporated USB into the FPGA - I bought a book on VHDL and didn't get any further since USB was still pretty new, not to mention trying to get my head around something completely new like VHDL - a few years later I had a lodger who worked for Imagination Technologies and could knock up mobile graphics chip designs using VHDL over breakfast 🤣
Despite them being a really cool idea, in 30+ years I have only ever used programmable logic twice. The first was in the 1990s where I had to replace an obsolete stopwatch IC (maybe a 7225?) with a plugin board. I was just a baby engineer in a company where only seniors got to program microprocessors. Then in the 2010s I concocted some far fetched SPI bus master driving an ADC at full pelt with a micro as SPI slave with DMA storage. It was the only way to get the performance the system required. And it needed EVERY BIT of small print in the ADC datasheet. Really, a different micro would have been better but the development kit for the CPLD was cheaper than a new compiler for the micro. So I did it the hard way.
I worked for Scantron (e.g., test-scoring machines) when their logic boards, which consisted entirely of discrete gates, evolved into microprocessor machines. I haven't worked for them in a while but I can see how their boards could evolve back (if they haven't already) into gates (FPGAs).
Cool! I always wondered how those machines worked. Unfortunately I never got to see them in use in school. Weirdly enough most of the time my teachers would grade the slips by hand. Wonder if the license expired but they still had boxes of slips lol.
As an FPGA engineer, thank you for making this video. Going to send this to anyone who asks me what I do.
Thank you ever so much!
I think I finally have these down well enough to start projects.
But it was hard as hell & pretty damn lonely to figure them out almost on my own.
PS: Thanks for the big picture. It helps me put the details together.
For small/simple/parallel processing, FPGAs can't be beat.
I think they can't be beat for Big/complex/parallel processing either!
There's an open source project called MiSTer that uses a FPGA to recreate various classic computers, game consoles and arcade machines.
Former non-technical Xilinx employee here. Every day I walked into the San Jose office, I would get this numinous feeling that we were an important part of human history. As I learned more about the products it became clearer. I worked there in 2019-2020, there were still plenty of employees from the late 80’s. One of the VP’s we supported started in 1990 and there were still at least 2 first 100 employees. People who made it to Xilinx never wanted to leave.
Fascinating history lesson here. I saw the whole thing from the late 70's on.
Yet another excellent video. Brought back so many memories for me watching this technology develop.
Also reminded me of the team of colleagues I knew that programed these devices.
John, I am extremely thankful for this video.
As a computer science guy I am fascinated by FPGA. I am interested to learn and try one out. Having a processor specialized to your need is just cool.
Seiko don't make ONLY digital watches. They have now (and always have had) fantastic analog/automatic watches.
A new Asianometry video after work? Fricking heck yeah. Keep it poggin', my man.
The OLMC (output logic macrocell) usually has configuration bits: one for an "inverted" output (the "XOR" bit, per OLMC on most models), whether or not to use the flip-flop (the "SYN" bit, usually global), and whether or not it is to be used as an input. The GAL16V8 is a really nice chip if all you need is a handful of logic gates for glue logic, or more advanced things like full 7-segment display decoders.
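For the curious, here's a rough behavioral sketch in Python of how those configuration bits interact. The bit names are mine for illustration, not the exact GAL fuse names from any datasheet:

```python
# Behavioral sketch of a GAL-style output logic macrocell (OLMC).
# Config bit names here are illustrative, not actual datasheet fuse names.

class OLMC:
    def __init__(self, xor_bit=0, registered=True):
        self.xor_bit = xor_bit        # 1 = invert the AND-OR sum-term output
        self.registered = registered  # False = combinational bypass of the flip-flop
        self.q = 0                    # flip-flop state

    def clock(self, sum_term):
        """Apply one clock edge; return the value seen at the output pin."""
        d = sum_term ^ self.xor_bit   # optional inversion via the XOR bit
        if self.registered:
            self.q = d                # D flip-flop captures on the clock edge
            return self.q
        return d                      # combinational mode: flip-flop bypassed

# Inverted combinational output: a 1 on the sum term comes out as 0.
cell = OLMC(xor_bit=1, registered=False)
print(cell.clock(1))   # → 0
```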
I love history about how various ICs came to be.
Started working on FPGAs with XC3000 series and now VU57P on my desk. Both shown at the end of the video. What a great trip.
FPGAs are also at the center of the top-performing emulation setups for old video games and computers, for those who are looking for the most accurate representation of actual hardware. There are ongoing projects right now to make an FPGA version of the venerable 6502 that runs at 100 MHz, which is at least 25 times faster than previous commercially available 6502s. And along those same lines, another ongoing project is the creation of a Motorola 68030 running at 1 GHz.
I learned so many things I didn't even know I wanted to know in this video.
It would be great if you did more FPGA videos. Some really cool chips coming out.
I really appreciate the work you are doing, it gives viewers a valuable historical analysis of technology that explains how we got to where we are today, and it also provides insight into future possibilities yet to be realized. Thank you!
"CMOS was not available in the Unites States at the time and the company had to go to Ricoh in Japan for it." False. CMOS was invented and commercialized in the US. Ricoh was just the only company making EPROMs from CMOS that was willing to work with Altera.
Truth.
This guy is such a good storyteller
Good one! Have been wanting you to cover Xilinx for a while :)
Within the chip design industry, it's very common to utilize FPGAs as a physical emulation platform for verification of designs prior to tapeout. There are some obvious benefits including being able to emulate designs on a platform which more closely aligns with the end product (instead of simulating through software), but one key advantage is the speed of emulation. Simulation through software is inherently extremely slow, with runtime increasing linearly or more with the number of transistors. However, programming and emulating a design via an FPGA is near instantaneous - with 10MHz or even faster clocks, designs can be fully verified literally within the blink of an eye, compared to countless hours chugging away on a server, eating a node-locked license. It's important to note a few downsides of FPGAs though, mainly the lack of waveform output (at least from my knowledge) since you can't really trace a hundred thousand electrical signals at the same time - from my experience, it's mostly been used as a means of connecting peripherals or sanity check.
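Some ballpark arithmetic behind that speed gap (the software-simulation rate below is an assumed round number, not a measurement of any particular tool; the 10 MHz figure comes from the comment above):

```python
# Rough comparison of RTL software simulation vs FPGA emulation throughput.
# All rates are assumed, ballpark figures for illustration only.

sim_cycles_per_sec = 1_000        # assumed effective rate of a software RTL sim
fpga_clock_hz      = 10_000_000   # 10 MHz emulation clock, per the comment

target_cycles = 1_000_000_000     # a billion cycles of stimulus

sim_hours = target_cycles / sim_cycles_per_sec / 3600
fpga_secs = target_cycles / fpga_clock_hz

print(f"software sim: ~{sim_hours:.0f} hours, FPGA emulation: ~{fpga_secs:.0f} seconds")
# → software sim: ~278 hours, FPGA emulation: ~100 seconds
```

Even with generous assumptions for the simulator, the gap is several orders of magnitude, which is why emulation platforms exist despite the loss of full waveform visibility.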
One thing you didn't mention: FPGAs are used where programs will change over time, so you can re-optimize the hardware to run new algorithms faster, as in many transmission devices, which are deployed and unchanged for years as hardware while the software moves on.
The US military pretty much made them stick and still is a major driver in the products that Altera and Xilinx offer. FPGAs are perfect for .mil signal processing like radar, sonar, avionics, etc. They were also huge on Actel back in the day. Fun stuff.
It's perfect because you get near the same performance as an ASIC and you don't need to hand off your designs to a third party
I started with FPGAs in 2001, and was fortunate to land a job at a company that had an internal training program with university-level instructors. The flip side is that I learned VHDL first, and then found Verilog distasteful, but that's another story. Back then, we did all of our HDL entry in text editors and ran them through hideously expensive tools for simulation (Modelsim) and synthesis (Synplify Pro). So expensive that our department had only a few dozen licenses for a department of 100+ folks. People would camp on them, go to lunch, and find their network cables pulled.
The real challenge back then was making timing. The chips were fantastic, but once you exceeded a certain percentage of their resources, the clock speeds started dropping dramatically. I was working on signal filtering systems that had to meet a minimum clock rate, and I would spend hours trying to tweak the design to get that last little bit of speed out. The synthesis tools took hours on the PCs of the day.
However, as others pointed out, the real shift began when both Xilinx and Altera started giving away limited versions of the tools, and then actually sold reasonably priced FPGAs that would work with them. All of a sudden, you didn't have to be a gigantic megacorp to use the technology. In fact, the company I work for now almost invariably uses Cyclones (Altera house) because most of the control applications we design don't need anything bigger or better - though we do occasionally bump up to Arria or Stratix for bigger jobs.
to show you the power of flex tape
I REPROGRAMMED THIS CHIP IN HALF
Asianometry, you’re a bit of a legend mate. Cheers.
When studying electronics for a design-and-build project, our class was warned that using any PAL- or RAM-based implementation would result in a fail. A full set of 7400-series chips resulted, and a lot of understanding too.
15:53: SiliconBlue was bought by Lattice and constitutes the core of their newer low-cost offerings, while Actel was bought by Microsemi, which was bought by Microchip... Altera by Intel and Xilinx by AMD... So Lattice is the last "original" FPGA house now.
Thanks for the great video! Maybe a good idea to have a video on eASICs as well.
Great history of the FPGA. I use them all the time and love configuring them. They are gems.
The LEGOs of the digital world.
@@brodriguez11000 Well put
@@brodriguez11000 Without the pain of stepping on a stray lego piece / wire wound resistor with leads pointed upwards.
Amazing video with excellent detail. Thank you very much!
The People’s Liberation Army bit got me real good 😂
"as ASICs get increasingly expensive to design and fab"
We're really shooting ourselves in the foot with that. Honestly, at this point I'm surprised there isn't already a JLC equivalent for 1-micrometer-process ASIC manufacturing. It feels like it shouldn't be too hard with the advances in machining technology. CNCs can now often hit 5-10 um tolerances, and an experienced machinist can hit sub-micron tolerances with grinding and other techniques. And looking at the technology in DLP projectors, I figure it might not be too difficult (comparatively) to make 1-micron stuff without actually having to use masks.
5:05 I actually worked with John on a programmable logic project recently. I didn't know how influential his past work was until several months after the project ended XD. He made very thorough test vectors, a real eye for detail.
Hey man your videos are getting better in terms of quality.
Great video. Really interesting story about the rise of these incredible devices.
Worked for Altera in the early 2000's, fascinating technology. So niche then, remarkable how mainstream the devices are now with things like MiSTer etc. Some of the biggest FPGAs I sold around then were $4K per single device. :/
Still cheap compared to making your own dedicated ASIC...
The fuse burning has made a return a few times over the journey. The Nintendo Switch uses a similar process to "protect" itself from being downgraded/hacked
Great Video!
Slight correction:
Not every combinational function can be implemented using only AND and OR gates, but with NOT, AND, and OR every one can (functional completeness)! Which is why most had all three and not only two.
Before the FPGA were PAL (Programmable Array Logic) chips such as the AMD 22V10 - an AND-OR array of logic with flip-flops on the outputs, programmable by JEDEC-compatible programmers. FPGAs are super fast and still used in niche applications. I worked at MMI and Lattice Semiconductor Corporation on these chips.
FPGAs were not super fast. Vendors like Xilinx would quote very high operating frequencies, but all that was was a D-type flip-flop configured as a T-type and clocked as high as possible. Such a trivial circuit did not use any of the interconnect.
Start using the interconnect across the CLBs (if I recall correctly, 30 years later) and the performance dropped dramatically.
ASICs were faster, and full-custom technology faster still.
Such great videos on such technical subject matter, great job
The problem with FPGAs from the big 2 is that they are so complicated and the EDA has a long learning curve. There are not enough young engineers learning it. Young engineers are not given the opportunity to pick it up, as projects don't give them time to learn. My observation is that most designers are in their 40s.
Furthermore, Xilinx is more interested in promoting high-level languages and AI because it sells bigger and more expensive ICs, so newcomers start from there. So who's going to work on the challenging low-level stuff in the future?
I have observed the same: there are fewer engineers who work with HDL directly and know how the FPGA works, year after year.
Furthermore the curren Xilinx approach has render the designs dependent of the Xilinx devices. This makes impossible to port the design to other FPGA manufacturers. This has become specially my important over the last year due to the semiconductor shortage.
Excellent intro to FPGA. Well done!
I remember TTL: Transistor-Transistor Logic. We used it in our Logic Circuits Lab subject during college days. We used 7400-series ICs for our truth-table logic projects on a breadboard.
Another excellent ep, thank you!
Woo, very important subject, rarely mentioned by YT channels since most of them consider it a sort of sorcery.
Optical and FPGA could be interesting.
This video is the best homage to the humble FPGA I've seen thus far
As someone who has a working relationship with a chip maker but no real technical knowledge of the field, I find your videos really good at explaining the technology.
It saddens me greatly that there's a reason barely any engineers get to talk on stage about their products.
I wish more engineers took PR lessons. Maybe then the bean counters wouldn't have to send Bob from marketing to take center stage.
The head of AMD does pretty well.
Another fun fact: today, FPGA boards and FPGA-based machines (the Synopsys ZeBu server, for example) are used to emulate chips before their fabrication.
Yeees glad to finally see this idea come to life :)
There's like 50 people who work on FPGAs, and I'm one of them! WOOOOOOO!
Just, I hope you like looking at oscilloscopes — or simulated oscilloscopes. But hey, it's the only programming challenge that has no stack, heap, or oversimplified abstracted libraries. The only values for a variable are 1, 0, High, Low, and **** (what you say when you get an X).
How the heck is cross-talk mitigated at such interconnect speeds now? Software scheduling of "transmissions" that are physically close to each other to mitigate signal degradation?
@@bluesteelbass Two things: there are way fewer interconnects than you'd think (maybe tens to hundreds?) for any given unit (limited fan-out), and clock speeds are around 200 MHz at the high end, so every signal has some nanoseconds to "settle down" before the next clock cycle. Making sure of that is actually an entire phase of design called timing.
But the FPGA has no ALU, because *every* LUT/CLB is a mini dedicated ALU, so you can pipeline and parallelize to hell and back to make accelerator cards of any configuration. Then the low clock speed is less of an issue if you can push huge data throughput over PCIe regardless.
@@adissentingopinion848 Aha! Programmed "settle" times. I did not realize the clock speeds of the programmable logic were so low compared to modern CPUs. When doing timing optimization, does the software know the architecture of the chip? If a "circuit" is routed along a physically longer path than another, and both tasks need to be performed in parallel, are there algorithms that insert "wait" times for pieces of info that physically travel faster than others? I would assume the timing phase of design corrects for issues like this?
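To put numbers on the "settle down" budget mentioned above, here's a back-of-the-envelope slack calculation in Python. All the delay values are invented for illustration, not taken from any real device's datasheet — but this is the shape of the check a timing tool performs on every register-to-register path.

```python
# Hypothetical timing budget for one register-to-register path
# at a 200 MHz clock (all delay numbers are illustrative).
clock_period_ns = 1e9 / 200e6           # 5.0 ns per cycle

clk_to_q_ns    = 0.5   # source flip-flop output delay
logic_delay_ns = 2.1   # delay through the LUTs on the path
route_delay_ns = 1.6   # delay through the programmable interconnect
setup_ns       = 0.4   # setup requirement of the destination flip-flop

slack_ns = clock_period_ns - (clk_to_q_ns + logic_delay_ns
                              + route_delay_ns + setup_ns)
print(f"slack = {slack_ns:.2f} ns")     # positive slack => timing met
```

If the slack came out negative, the tool (or the designer) would have to re-route, re-place, or pipeline the path — which is essentially the "wait for the slowest signal" behavior asked about above.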
This was one big problem I had when looking to experiment with these myself. The programming software did not seem very deep for things like using the FPGA's physical layout to better route signals for timing, concurrency, and integrity, or for "code optimization" of the logic you are programming.
It seems like Vivado is finally "free" for some of their older product line. This is interesting.
If you have any advice/references on pre-built modules/packages, paid or free, that would be fantastic. I am not trying to re-invent the wheel, and people with more experience in this than me have already done great work.
I need to find some more recent physical tech specs of these things. Curious how much black magic of high-speed circuit design is implemented in the chips, and what other creative ways solved problems of throughput and cross-talk. This reminds me of the PCIe 4.0 specs and speeds: the PCIe slot furthest from the CPU would most of the time never reach the fully rated bus speed, due to the distance the signal has to travel across the motherboard and the crosstalk from higher bus speeds compared to the PCIe 3.0 spec.
@@bluesteelbass The first step is getting good fundamentals of RTL design. I suggest understanding digital logic (Booleans and Kmaps), and search up Greg Stitt. He posts his college courses for free, sublime stuff. His "My tutorial" is like an entire book, and his labs go from baby's first ALU to a whole custom MIPS computer with a custom instruction set! And that's just the first Digital Design course. Then to reconfig 1 and 2. At that point, if you had done all of this, I would hire you over literally any regular CE graduate.
@@bluesteelbass It isn't the clock speed that is amazing in an FPGA, it is the parallel structure. There are hundreds of multiply-accumulate blocks, allowing very fast filtering or video processing. The instructions are encoded in the structure — no need to fetch. Many very fast memory blocks are included. These are now dual-clock to allow crossing clock domains, which is very challenging otherwise.
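The parallel multiply-accumulate structure described above can be sketched in plain Python. The coefficients and the one-MAC-per-tap mapping are invented for the example; on a real part, each tap would land on a dedicated DSP/MAC block, so all four products really are computed in the same clock cycle.

```python
# Illustrative sketch: a 4-tap FIR filter mapped the way an FPGA
# would do it — one multiply-accumulate (MAC) per tap, all firing
# in parallel each clock cycle. Coefficients are made up.
coeffs = [1, 2, 3, 4]

def fir_step(delay_line, new_sample):
    """One 'clock cycle': shift in a sample, fire all MACs at once."""
    delay_line = [new_sample] + delay_line[:-1]
    # In hardware these four products are computed simultaneously by
    # four dedicated MAC blocks, then combined by an adder tree.
    acc = sum(c * x for c, x in zip(coeffs, delay_line))
    return delay_line, acc

line = [0, 0, 0, 0]
for sample in [1, 0, 0, 0]:        # impulse input
    line, y = fir_step(line, sample)
    print(y)                        # impulse response: 1, 2, 3, 4
```

A CPU would iterate over the taps; the FPGA's "instructions encoded in the structure" means the whole loop body exists as parallel hardware, once per tap.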
I just programmed my first fpga 2 days ago. Very cool to think about
Good job! Now that you've designed the FPGA functionality, make sure that you simulate it. The bigger the FPGA design is, the more you have to rely on simulation. With simulation, and with design requirements written down before you start, you can literally throw the power switch and have it work correctly the first time! I've done this many times.
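The simulate-first workflow can be mimicked even in plain Python: model the design's behavior, write the checks from the requirements, and only program the part once everything passes. A toy sketch — the counter design and its "requirements" are invented for illustration:

```python
# The 'design': one clock edge of a 2-bit counter with enable.
def counter_step(count, enable):
    return (count + 1) % 4 if enable else count

# The 'testbench': drive stimulus, assert the written requirements.
count = 0
for cycle in range(10):
    count = counter_step(count, enable=True)
    assert 0 <= count <= 3, "requirement: counter stays within 2 bits"
assert count == 2          # 10 steps from 0 wraps twice, lands on 2
print("all checks passed before programming the part")
```

In a real flow, the same idea is done with an HDL simulator and a testbench, but the discipline is identical: the requirements become assertions, and hardware is only touched once they all pass.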
@@koyotekola6916 yes, I used verilog and got the tests to work before programming the fpga and it worked first time 😎
@@blackbriarmead1966 Congratulations, and I mean it. Simulating before declaring "it is done" has not been many engineers' forte. I believe the rush to get something out the door causes many engineers to focus immediately on the functional design. It almost takes a visionary to see how the two work hand in hand, and how, unintuitively, together they cut schedule. Even some managers don't understand this concept.
I am an RFIC designer and love your channel. Are you planning to make any videos on analog/high frequency design aspect of integrated circuits? GaN would be an interesting subject as well as beamformers for phased array antennas.
I really, really respect RFIC engineers!
Wow. This brings back memories of 1986, when I had to give an internal talk about the benefits of FPGAs (Xilinx) over PALs while I was an intern at a telecom R&D institute here in the Netherlands. I recall that we had to program PAL chips with a programming language called ABEL. For me, with my programming background, it was very easy to describe circuit logic using that system. Never heard of ABEL since then. Anyone?
7:40 DRAM has the same problem that PLAs had: the length of the wires stays the same as more and more devices connect to them. The solution is the same: integrate more blocks as you add more transistors. FPGAs were a regression to PROM in that the core element is a tiny SRAM, but FPGAs do scale, because the block doesn't grow. As more metal layers were added, these could be used to maintain the absolute reach of blocks.
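The "core element is a tiny SRAM" point can be made concrete: a k-input LUT is just a 2^k-bit memory whose address lines are the logic inputs and whose stored bits are the truth table. A Python sketch — the majority-gate configuration is my example, not any real bitstream format:

```python
# A K-input LUT is a 2^K x 1-bit memory: inputs form the address,
# the stored bits are the truth table.
K = 4

def program_lut(func):
    """'Program' the LUT: enumerate all input patterns, store outputs."""
    table = []
    for addr in range(2 ** K):
        bits = [(addr >> i) & 1 for i in range(K)]
        table.append(func(bits))
    return table

def read_lut(table, inputs):
    """Evaluate the LUT: pack the inputs into an address, look it up."""
    addr = sum(bit << i for i, bit in enumerate(inputs))
    return table[addr]

# Configure the LUT as a majority-of-4 gate (example choice).
majority = program_lut(lambda bits: int(sum(bits) >= 3))
print(read_lut(majority, [1, 1, 1, 0]))   # -> 1
print(read_lut(majority, [1, 0, 1, 0]))   # -> 0
```

Reprogramming the FPGA just rewrites these little memories, which is why the block stays the same size no matter how many of them the process can afford.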
Thank you friend and teacher
Would have liked to hear of the applications at the beginning of the vid. Great vid. New subscriber.