It is worth mentioning that the Harvard architecture is very common in microcontrollers. They fetch instructions directly from Flash, which can be organized with various numbers of bits per word (12, 14, 16, or 24 in Microchip PIC microcontrollers).
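To make the separate-address-space idea concrete, here is a minimal sketch for an AVR microcontroller (another Harvard-style family, used here only because its toolchain is widely known; the same point applies to PIC). Constant data placed in program Flash cannot be read like an ordinary RAM variable; it needs a dedicated program-memory read.

```c
/* Minimal AVR sketch (compile with avr-gcc): constant data kept in the
   instruction Flash lives in a separate address space from data RAM. */
#include <avr/pgmspace.h>

const char greeting[] PROGMEM = "hello";   /* placed in program Flash */

char first_char(void)
{
    /* A plain greeting[0] would read from the RAM address space;
       pgm_read_byte() emits the LPM instruction, which reads Flash. */
    return (char)pgm_read_byte(&greeting[0]);
}
```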
Seeing you respond to all these comments brings me great joy. How lucky your students must be to have someone like you, and how lucky I am to have stumbled onto your channel whilst revising! Refreshing to have teachers like you in the world. Keep at it!! :D
Thank you so much. You made my day :)KD
So Modified Harvard architecture = Harvard between the CPU and memory, and von Neumann inside the CPU?
hey, could you share the images used in this video?
Thank you for the lesson. Clear, concise and well presented.
You are most welcome :)KD
You made my day.
This brings me to the question: why is the von Neumann architecture the most popular and most widely used architecture?
The truth is that modern computers have moved on a long way from the original von Neumann architecture (co-processors, GPUs, pipelining, cache, etc.), although the essence of von Neumann remains (i.e. programs in memory). I suppose everything comes down to expense and what we are prepared to pay. :)KD
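To make the "programs in memory" point concrete, here is a minimal sketch (assuming a POSIX system on an x86-64 CPU; not from the video, and some hardened systems forbid writable-and-executable pages). On a von Neumann machine, instructions are just bytes in the same address space as data, so a program can write machine code into memory and then jump to it, which is exactly what JIT compilers do. On a strict Harvard machine this data-to-code round trip would not be possible, because data memory and instruction memory are separate.

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    /* x86-64 machine code for: mov eax, 42 ; ret */
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    /* Ask the OS for a page we may write to and later execute from. */
    void *buf = mmap(NULL, sizeof code,
                     PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    memcpy(buf, code, sizeof code);          /* write the "program" as data */
    int (*fn)(void) = (int (*)(void))buf;    /* then treat it as code       */
    printf("%d\n", fn());                    /* prints 42                   */

    munmap(buf, sizeof code);
    return 0;
}
```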
@ComputerScienceLessons Oh, I see. I was thinking the same question earlier, thank you :)
@ComputerScienceLessons New instructions are constantly being added by the programs we download, so it makes more sense to keep programs and data in the same memory space. It also allows in some cases for increased performance, with the trade-off of increased power consumption. This is why fitness trackers may opt for something closer to a modified Harvard architecture, as the separation between program and data memory means less local power consumption.
mafia probably
@nathanjahzielmuncal2699i unsubbed
Quite a way to start the video, I must say. Great content!
Thank you. I really appreciate your comment.
I'm so glad I stumbled upon your channel :D
great explanation, thank you
You are very welcome :)KD
The von Neumann architecture made sense when memory was very limited and you could adjust how much of it instructions and data each needed. But with today's memory sizes, it seems to me that the Harvard architecture would make more sense.
Can we read from, write to, or read and write two different memory locations simultaneously if we use two address buses and two data buses in the von Neumann architecture?
thank you so much
You are very welcome :)KD
Are there any other weird architectures that were less effective/successful?
Both architectures are used in modern CPUs, so why "less effective/successful"?
@krzysztofjarek6476 These two, yes. But it would be safe to assume that the British and Americans were not the only ones working on computers, and thus there might have been some other architectures. I assume we use these two because numerous other architectures proved themselves inferior, just like basically everything in engineering.
And let's not forget cost; it all comes down to what we are prepared to pay. A lot of effort these days goes into the development and marketing of GPUs and memory, driven largely by the gaming market and AI. :)KD
@ComputerScienceLessons Nani?
Could you describe how the Harvard architecture could be used to design a device that is able to work on its own? Kindly assist.
Tbh idk 😅
brilliant, thank you!
amazing
:)KD
I'm a Latin American Spanish speaker. I like your videos, but I would prefer Spanish subtitles. Some words I recognize; for other topics and words I don't catch what you are saying.
Hello. I will try to speak more carefully in the future, so that it is easier to translate. :)KD
🙏
😊 :)KD
It’s typically pronounced Von (or Fon) Noyman rather than ‘Newman’.
omg first!
Congratulations
🏆 Ta Da :)KD