Researchers develop ultra-fast light-based microprocessor

The major point of photonic logic is the significant power savings, which directly improve the perf/Watt ratio. Even if applied only to memory and peripheral I/O it would still be major progress, since moving data over external interfaces is one of the most power-demanding operations. Optical signals can also run at orders-of-magnitude higher frequencies, expanding the available bandwidth without the expensive, power-hungry wide copper wiring needed on ICs and PCBs. That is already evident in long-range wired communications -- all of the seafloor cables are fibre-optic for the same reasons.
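A back-of-the-envelope sketch of that perf/Watt argument. The pJ/bit figures below are illustrative order-of-magnitude assumptions, not measured values for any real interface:

```python
def io_power_watts(bandwidth_gbps: float, energy_pj_per_bit: float) -> float:
    """Power needed to move data at a given bandwidth and energy cost per bit."""
    bits_per_second = bandwidth_gbps * 1e9
    joules_per_bit = energy_pj_per_bit * 1e-12
    return bits_per_second * joules_per_bit

# Assumed placeholders: ~10 pJ/bit for an external copper link
# vs ~1 pJ/bit for an integrated optical link.
copper = io_power_watts(100, 10.0)   # 100 Gb/s over copper
optical = io_power_watts(100, 1.0)   # same bandwidth optically
print(f"copper: {copper:.2f} W, optical: {optical:.2f} W")
```

Under those assumptions the same 100 Gb/s costs 1.0 W electrically but only 0.1 W optically, which is the whole perf/Watt pitch in one line.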
Don't you just love the word photonic? 🙂 Technology is advancing faster and faster as time passes; one can only dream of what devices we will be using 20 years from now, how they will be incorporated into our daily lives, and how the technology will be used in medicine, e.g. prosthetic mechanical limbs. I'm sure that one day we will be able to have microprocessors implanted directly in our brains (hope it's overclockable) lol.
In the next 5 years, I do not see myself putting a processor into a socket that has only a few pins for power delivery, doing the same for memory, graphics card, sound card, LAN and everything else, and then plugging in optical cables, making an ugly octopus in the process. Unless it is as easy to install and as reliable as today's systems, it will never take off.
All things considered, LGA sounds like the way to go, but instead of pins it would be a flat array of optical-grade transparent ends, flush with the top layer of the surface pressing against the chip. (I suck at describing stuff)
I've been reading about light-based processors for the last five years (or more). Every time a company feels the need for more capital investment, it puts out one of these little blurbs promising "breakthroughs" that force you to read the fine print to discover that any actual marketable product based on the "breakthrough" is at least a decade away (or more). (Doesn't matter the field or the product.) Translation: don't hold your breath waiting on this, but please give us your money so we can keep working on it. I may be cynical, but a "breakthrough" for me is: "Hi! We've designed an optical processing CPU that will hit the market in 6-9 months and it will forever change your perspective on computing performance. Hang on!" That's a breakthrough...;) (Or, it will be in 6-9 months.)
That supposed joke lost all meaning considering all modern computers can run Crysis. That joke is as dead as a dodo.
In the next 5 years, I do not see myself putting a processor into a socket that has only a few pins for power delivery, doing the same for memory, graphics card, sound card, LAN and everything else, and then plugging in optical cables, making an ugly octopus in the process. Unless it is as easy to install and as reliable as today's systems, it will never take off.
Be serious, it would be nothing like that. Picture mainboards with tiny fibre-optic channels replacing the current PCBs' electrical traces. Actually a hybrid of both, since you still need electrical power: sockets with optical channels and just a few pins for power delivery. This could also be applied to volatile memory, but I don't see a way to bring this technology to non-volatile memory. It has potential.
I've been reading about light-based processors...
That's because it's currently impossible to build an entire optical processor: there is no way to make optical memory, and no one has even proposed a working principle for one. The best we can do is a delay line feeding multiplexers, so most of the processing is still done by electric circuits. This is exactly that situation: all they did was switch electrical buses to optical ones. It's good as an idea because it can improve throughput, but I think it's quite impractical. You still need at least a decent diode for the transmitter and receiver on top of the optical fibre (or a waveguide), and both would be quite hard to integrate. It's not impossible, just really expensive, as it would take up a lot of space on the chip. Now, if someone made a fully optical memory, we would be able to switch to fully optical processors in, say, a decade. But as it is right now, we can only mix the two, and I don't think the benefit is worth the price and effort. It is still worth exploring, though, since we will soon reach the limit of integration and miniaturisation and will need to change the technology: there is only about 1.5 nm between two atomic layers, and some quantum effects can already be observed at 8 nm integration.
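To see why a delay line is no substitute for real memory, here is a rough sketch of how few bits stay "in flight" inside a fibre loop. The refractive index is an assumed typical value for silica fibre:

```python
C = 299_792_458   # speed of light in vacuum, m/s
N_FIBER = 1.468   # assumed refractive index of silica fibre

def bits_in_flight(length_m: float, bitrate_bps: float) -> float:
    """Bits circulating through a fibre delay line of the given length."""
    transit_time_s = length_m * N_FIBER / C
    return transit_time_s * bitrate_bps

# A full kilometre of fibre at 10 Gb/s holds only ~49 kbit in flight.
print(round(bits_in_flight(1000, 10e9)))
```

A kilometre of coiled fibre "storing" tens of kilobits, with no random access, is why delay lines work for multiplexers but not as RAM.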
Sorry, but the headline is wrong. Any quad-channel Intel (consumer) chip will reach those throughput numbers with DDR3. And if it's about the energy savings (not the performance), wouldn't "Researchers develop ultra-efficient thingy based on an 'optical' microprocessor" make more sense?!
Converting electricity to light and back adds a small delay each time. So, fail.
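For scale, a hedged sketch of that delay. The 2 ns per-hop conversion figure and the 3 GHz clock are assumptions for illustration, not datasheet values:

```python
CONVERSION_NS = 2.0   # assumed E/O + O/E conversion latency per hop, ns
CPU_GHZ = 3.0         # assumed core clock frequency

def conversion_cycles(hops: int) -> float:
    """CPU cycles lost to electro-optic conversions over `hops` links."""
    return hops * CONVERSION_NS * CPU_GHZ

print(conversion_cycles(2))  # out and back: 12.0 cycles at 3 GHz
```

Whether a dozen cycles per round trip is a fatal overhead depends on what it buys in bandwidth and power, which is the trade-off the thread is arguing about.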