Originally posted by twhitehead
Thanks for the info. I just want to say (what I seem to have trouble conveying) that I am wondering if anyone has abstracted the components of the computer in such a way that we might begin to study their various arrangements, types, etc., and the consequent effects on performance, in a mathematical/pseudo-physical kind of model (treating the component in question as a macro entity for analysis, not strictly as the sum of its microscopic elements). Example: the way classical mechanics assumes continua and ignores the individual atoms in a body, even though that is not strictly the truth. It helps tremendously in understanding systems of interactions for organization, optimization, and application. Has anyone begun a macro-scale abstraction of computer systems in this manner?
Ultimately a processor contains flowing electrons. So the absolute limits are the speed of electricity (which is a little slower than the speed of light), and the frequency with which pulses (changes in the voltage) can occur and still work a transistor.
Making faster transistors has been one of the key developments towards faster computers as has making ...[text shortened]... graphene for the wires as that has lower resistance than current wires and thus produces less heat.
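A quick back-of-envelope illustration of the limit quoted just above. The numbers are my own illustrative choices, not from the thread: a 4 GHz clock, and an assumed on-chip signal speed of roughly half the speed of light.

# How far can a signal travel in one clock cycle?
# Illustrative figures only: 4 GHz clock, signal speed assumed ~0.5c.
C = 3.0e8                 # speed of light in vacuum, m/s
SIGNAL_SPEED = 0.5 * C    # assumed signal speed in on-chip wiring, m/s

clock_hz = 4.0e9
period_s = 1.0 / clock_hz

print(f"clock period: {period_s * 1e12:.0f} ps")
print(f"light in vacuum covers {C * period_s * 100:.1f} cm per cycle")
print(f"an on-chip signal covers roughly {SIGNAL_SPEED * period_s * 1000:.0f} mm per cycle")

At those figures a signal only covers a few centimetres per cycle, which is the same order of magnitude as the size of a chip, so propagation delay genuinely matters at today's clock speeds.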
Originally posted by joe shmo
Yes. Processor designers abstract at many different levels, including dealing with things like heat flow and timing signals.
Has anyone begun a macro-scale abstraction of computer systems in this manner?
They have to when they are dealing with billions of transistors.
However, this is largely kept in-house by processor developers and, as far as I know, is not so much a topic of university study. This is partly because it is such a specialized field, partly because a lot of it is proprietary, and partly because it is forever changing.
Having said that, I could be wrong. There are courses on processor design on the web such as this one:
https://www.youtube.com/playlist?list=PL34856A71E943207F
I also recommend watching this:
https://www.youtube.com/watch?v=Jyp6jFCzW44
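As a concrete taste of the kind of macro-scale abstraction joe shmo is asking about, here is a minimal sketch (my own toy example, not something from the lectures above) that treats a cache and main memory as black boxes described only by a hit rate and two latencies, using the standard average-memory-access-time formula.

# Toy macro-level model: the cache and DRAM are characterised only by
# a hit rate and access latencies (all numbers illustrative).

def average_memory_access_time(hit_time_ns, miss_rate, miss_penalty_ns):
    """Classic AMAT formula: hit_time + miss_rate * miss_penalty."""
    return hit_time_ns + miss_rate * miss_penalty_ns

if __name__ == "__main__":
    # Sweep the miss rate to see how a single macro parameter
    # dominates end-to-end performance.
    for miss_rate in (0.01, 0.05, 0.10, 0.20):
        amat = average_memory_access_time(hit_time_ns=1.0,
                                          miss_rate=miss_rate,
                                          miss_penalty_ns=100.0)
        print(f"miss rate {miss_rate:4.0%} -> AMAT {amat:5.1f} ns")

The point is that one macro parameter, the miss rate, dominates the end-to-end figure without ever modelling an individual transistor, which is very much the "continuum mechanics" spirit of the original question.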
Originally posted by joe shmo
I haven't read this article and it's more abstract than the level you seem to be talking about, but might be of interest to you.
Thanks for the info. I just want to say (what I seem to have trouble conveying) that I am wondering if anyone has abstracted the components of the computer in such a way that we might begin to study their various arrangements, types, etc... and the consequential effects on performance in a mathematical/pseudo-physical model kind of a way ( treating the com ...[text shortened]... and application. Has anyone begun a macro scale abstraction of computer systems in this manner?
http://plato.stanford.edu/entries/computation-physicalsystems/
Originally posted by twhitehead
Interesting stuff, thanks for the videos. I haven't watched any of the lectures yet, but when time permits and my interest spikes I'll give them a go.
Yes. Processor designers abstract at many different levels, including dealing with things like heat flow and timing signals.
They have to when they are dealing with billions of transistors.
However this is largely kept inhouse by processor developers and as far as I know not so much a topic of university study. This is partly because it is such a specia ...[text shortened]... 34856A71E943207F
I also recommend watching this:
https://www.youtube.com/watch?v=Jyp6jFCzW44
Originally posted by DeepThought
Yeah, not exactly what I was searching for, but I really enjoy the philosophical side of these types of things; in fact, probably more than the actual thing itself. Again, thanks!
I haven't read this article and it's more abstract than the level you seem to be talking about, but might be of interest to you.
http://plato.stanford.edu/entries/computation-physicalsystems/
Originally posted by joe shmo
I don't think the lectures cover what you asked about. They appear to be more like an introduction to the architecture side of chip design. What you asked about would either be a more advanced course or something you would learn on the job at a chip design company. There are so few companies designing CPUs large enough for this type of analysis that there is not enough of a market for courses that advanced, and much of it will be trade secrets anyway.
Interesting stuff, thanks for the videos. I haven't watched any of the lectures yet, but when time permits and my interest spikes I'll give them a go.
But the second video did explain that they abstract at a whole range of different levels, testing and simulating the logic, the heat flow, and the electrical behaviour at each abstraction level.
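For a feel of what "simulating the logic" means at the lowest of those abstraction levels, here is a toy sketch of my own (not how real EDA tools work) that models gates as tiny functions and wires them into a one-bit full adder, then checks it exhaustively.

# Toy gate-level simulation: each gate is a small function; compose
# them into a 1-bit full adder and verify it against ordinary addition.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, cin):
    s1 = XOR(a, b)
    total = XOR(s1, cin)
    carry = OR(AND(a, b), AND(s1, cin))
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            total, carry = full_adder(a, b, cin)
            assert total + 2 * carry == a + b + cin
print("1-bit full adder behaves like addition for all inputs")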
Originally posted by twhitehead
The Next Big Thing is going to be photon processing rather than electron processing. There are already photonic encryption links, with encryption taking place at the level of single photons. The practical application draws straight from Heisenberg's principle that observing a photon changes it: namely, if any third party merely measures, duplicates, or records the photon along its network path, the other end will detect a change and know that the signal has been read by an unauthorized party.
Ultimately a processor contains flowing electrons. So the absolute limits are the speed of electricity (which is a little slower than the speed of light), and the frequency with which pulses (changes in the voltage) can occur and still work a transistor.
Making faster transistors has been one of the key developments towards faster computers as has making ...[text shortened]... graphene for the wires as that has lower resistance than current wires and thus produces less heat.
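The intercept-detection idea moonbus describes is usually explained via the BB84 protocol. Below is a toy simulation of my own (a minimal sketch, not real quantum optics): a "photon" is just a bit plus a basis, and measuring in the wrong basis yields a random result, which is exactly what lets the two endpoints notice an eavesdropper.

# Toy BB84-style sketch of eavesdropper detection (illustrative only).
import random

N = 20000

def measure(bit, prep_basis, meas_basis):
    """Correct result if the bases match, a random bit otherwise."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def run(eavesdrop):
    errors = sifted = 0
    for _ in range(N):
        alice_bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        bit, basis = alice_bit, alice_basis
        if eavesdrop:
            eve_basis = random.choice("+x")
            bit = measure(bit, basis, eve_basis)   # Eve measures...
            basis = eve_basis                      # ...and resends in her basis
        bob_basis = random.choice("+x")
        bob_bit = measure(bit, basis, bob_basis)
        if bob_basis == alice_basis:               # keep only matching-basis bits
            sifted += 1
            errors += (bob_bit != alice_bit)
    return errors / sifted

print(f"error rate without eavesdropper: {run(False):.3f}")  # ~0.000
print(f"error rate with eavesdropper:    {run(True):.3f}")   # ~0.250

Without the eavesdropper the sifted key has essentially no errors; with her in the middle roughly a quarter of the sifted bits disagree, which is what tips off the endpoints that the channel was read.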
Originally posted by moonbus
I doubt if that will have any significant impact on overall computing in the near future (20 years or so). I can't see how using photons will significantly speed up or shrink computer components.
The Next Big Thing is going to be photon processing, rather than electron processing.
The article googlefudge linked mentions new materials that could dramatically reduce power consumption and heat production, which could pave the way for 3D circuitry. Currently, the main reason for not going 3D is that there is simply no way to get rid of the heat.
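To put rough numbers on the heat problem: CMOS switching power is approximately activity x capacitance x voltage squared x frequency. The figures below are illustrative guesses of mine, not measurements of any real chip.

# Rough dynamic-power sketch: P = a * C * V^2 * f (illustrative numbers).

def dynamic_power_w(activity, capacitance_f, voltage_v, freq_hz):
    return activity * capacitance_f * voltage_v**2 * freq_hz

C_EFF = 25e-9   # assumed effective switched capacitance of a large CPU, farads
V = 1.0         # supply voltage, volts
for f_ghz in (2, 4, 8):
    p = dynamic_power_w(activity=1.0, capacitance_f=C_EFF,
                        voltage_v=V, freq_hz=f_ghz * 1e9)
    print(f"{f_ghz} GHz at {V} V -> ~{p:.0f} W")

Power grows linearly with frequency and quadratically with voltage, and stacking layers in 3D crams more of that power into the same footprint, hence the cooling problem.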
Originally posted by moonbus
Very unlikely. Electronic signals can already go through the wires at close to the speed of light, but that isn't really the main factor limiting the speed of microprocessors; rather, it is the limit on the frequency at which the digital signals can switch. And then there is the huge power consumption concentrated in one tiny microprocessor, which creates a big problem with heat dissipation, not to mention the environmentally damaging and expensive energy costs. If you want to know what the next big thing in computers is, try spintronics.
The Next Big Thing is going to be photon processing, ....
https://en.wikipedia.org/wiki/Spintronics
When the first practical, fully spintronic microprocessors are finally developed, they should run at much greater frequencies, and thus speeds, yet use virtually no power at all!
It has huge potential.
...and then perhaps the next big thing after that may be quantum computers that will use quantum effects to make just a few hundred particles do, in effect, more calculations per second than there are atoms in the known universe.
https://en.wikipedia.org/wiki/Quantum_computing
But so far practical quantum computers are proving extremely tricky to develop and are probably many years away.
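For what the "more calculations than atoms in the universe" claim usually means: n qubits have a state space of 2^n complex amplitudes, and somewhere around 266 qubits that already exceeds the commonly quoted ~10^80 atoms in the observable universe (a rough, standard order-of-magnitude estimate). A quick check:

# Compare the size of an n-qubit state space with ~10^80 atoms.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10**80   # rough order-of-magnitude estimate

for n in (100, 266, 300):
    states = 2**n
    comparison = "more" if states > ATOMS_IN_OBSERVABLE_UNIVERSE else "fewer"
    print(f"{n} qubits -> about 10^{len(str(states)) - 1} basis states, "
          f"{comparison} than ~10^80 atoms")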
Originally posted by humy
What kills speed in electronics is capacitance and inductance. They are the elephant in the speed room.
Very unlikely. Electronic signals can already go through the wires at close to the speed of light; but that isn't really the main factor that is limiting speed of microprocessors but rather the limitations of digital frequency of the signals. And then there is the huge power consumption concentrated in one tiny microprocessor that creates a big problem with hea ...[text shortened]... quantum computers are proving extremely tricky to develop and are probably many years away.
Spintronics doesn't use electric current as such, just the alignment of electron spins, which is pretty cool, literally 🙂
Another road to out-of-this-world speed is the quest for all-photonic circuitry.
It uses photons instead of electrons: a LOT faster, and cooler too.
Work has been going on for years to make an all-photonic transistor.
Here is one brief wiki about the optical transistor:
https://en.wikipedia.org/wiki/Optical_transistor
All this work is still in the realm of classical computation though: AND gates, OR gates, XOR gates, NAND gates and such.
If quantum computers get built, they will be faster still than even photonic computers.
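As a small aside on the classical gates mentioned a couple of lines above: NAND alone is universal, meaning every other classical gate can be built from it. A minimal sketch of my own:

# Build NOT, AND, OR and XOR purely out of NAND, then check them.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):      return NAND(a, a)
def AND(a, b):   return NOT(NAND(a, b))
def OR(a, b):    return NAND(NOT(a), NOT(b))
def XOR(a, b):   return OR(AND(a, NOT(b)), AND(NOT(a), b))

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b)  == (a | b)
        assert XOR(a, b) == (a ^ b)
print("AND, OR and XOR all reproduced from NAND alone")

That universality is one reason a single good switching device, whether an optical transistor or a spintronic one, is enough to build a whole classical computer around.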