By Andrew Nusca
Posting in Design
Intel on Tuesday announced a research prototype demonstrating that light beams can be used to replace electrons to transport data in computers of the future.
The field of study is called "silicon photonics," and Intel says it matters quite a bit.
Why? Because it could enable futuristic applications of technology such as:
- A wall-sized, three-dimensional display with a resolution so high that friends, family or business colleagues appear to be in the room with you.
- A datacenter or supercomputer that can have its components spread throughout a building or an entire campus instead of in one location and confined by heavy copper cables.
- For Internet companies, security firms and financial institutions, a way to boost the performance of serious number-crunching while reducing its costs in money, space and energy.
Intel built a prototype of the world's first silicon-based optical data connection with (wait for it...) integrated lasers.
That matters because the optical link can move data over longer distances and much faster than copper cables: up to 50 gigabits of data per second.
(If that means nothing to you, try this: an entire high-definition movie being transmitted each second.)
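A quick back-of-the-envelope check of that claim, sketched in Python. The movie size here is an assumption (a typical compressed HD film), not a figure from Intel:

```python
# Unit check on the headline figure. The movie size is an assumed value.
BITS_PER_BYTE = 8

link_gbps = 50                               # link rate, gigabits per second
link_gBps = link_gbps / BITS_PER_BYTE        # 6.25 gigabytes per second

hd_movie_gb = 5                              # assumed compressed HD movie size, GB
seconds_per_movie = hd_movie_gb / link_gBps  # 0.8 s -- roughly "a movie a second"

print(f"{link_gBps} GB/s; one {hd_movie_gb} GB movie in {seconds_per_movie:.1f} s")
```

Note the gigabit/gigabyte distinction: 50 gigabits per second is 6.25 gigabytes per second, which is why the per-movie time works out to under a second only for compressed films of a few gigabytes.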
Copper cables lose signal strength over long distances, which is why most supercomputers are also super-compact: their components have to stay close together.
Lasers for transmitting information aren't a new thing. But Intel's interest is in moving beyond exotic materials such as gallium arsenide to readily available silicon, which could make such computing power available to the broader market.
Intel CTO Justin Rattner demonstrated the company's 50Gbps "Silicon Photonics Link" at the Integrated Photonics Research conference in Monterey, Calif.
The project is something of a research milestone for the iconic tech company: it comprises the first hybrid silicon laser and the first high-speed optical modulators and photodetectors.
Here's how it works, step by step:
- The transmitter chip is made of four hybrid silicon lasers.
- Light beams from the lasers each travel into an optical modulator.
- The modulator encodes data onto the beams at 12.5Gbps.
- The four beams are then combined and output to a single optical fiber, for a total data rate of 50Gbps.
- At the other end of the link, the receiver chip separates the four optical beams and directs them into photodetectors.
- The photodetectors convert the data back into electrical signals.
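The steps above can be sketched as a toy software model: stripe data across four "beams," combine them onto one channel, then split and re-interleave at the receiver. This is an illustration of the lane structure only; the real device works in optics, and the function names are mine, not Intel's:

```python
# Toy model of the four-lane optical link described above (illustrative only).
from itertools import zip_longest

NUM_LANES = 4           # four hybrid silicon lasers on the transmitter chip
LANE_RATE_GBPS = 12.5   # each optical modulator encodes data at 12.5 Gbps

def transmit(payload: bytes) -> list[bytes]:
    """Modulator stage: stripe the payload across the four beams."""
    return [payload[i::NUM_LANES] for i in range(NUM_LANES)]

def receive(lanes: list[bytes]) -> bytes:
    """Receiver stage: photodetectors recover each beam; re-interleave them."""
    out = bytearray()
    for group in zip_longest(*lanes):          # lanes may differ in length by one
        out.extend(b for b in group if b is not None)
    return bytes(out)

# The four combined beams share one fiber, for an aggregate 50 Gbps.
assert NUM_LANES * LANE_RATE_GBPS == 50.0

message = b"an entire HD movie, every second"
assert receive(transmit(message)) == message
```

The round-trip assertion at the end mirrors the article's point: what goes in as electrical data comes out the far end as the same electrical data, with the optics invisible in between.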
There you have it -- ultra-fast computing power.
The key point: the chips involved can be made using the same cheap processes used by the semiconductor industry.
Intel says its researchers are already working to increase the data rate by scaling the modulator speed. They're also trying to increase the number of lasers per chip.
The ultimate goal: terabit-per-second optical links, or fast enough to transfer all of the data on a laptop in one second.
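A rough sanity check of that target, with an assumed 2010-era laptop drive size (the article doesn't specify one):

```python
# "Laptop in one second" at 1 Tbps. The drive size is an assumed value.
TBPS = 1e12                          # one terabit per second, in bits
laptop_gb = 120                      # assumed laptop drive size, in gigabytes
laptop_bits = laptop_gb * 1e9 * 8    # decimal gigabytes -> bits

print(f"{laptop_gb} GB at 1 Tbps: {laptop_bits / TBPS:.2f} s")  # ~0.96 s
```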
Photos: Intel's 50Gbps Silicon Photonics transmit module; University of California at Santa Barbara professor John Bowers and Intel Labs fellow Mario Paniccia. (Intel)
Jul 27, 2010
*Yawn* What about the bio-engineered pig's brain cells from the early '80s, read by lasers? If I remember rightly, each cell was capable of about 1GB of storage. Molecular computing - that's the way to go. Of course, it brings a whole new meaning to getting a virus on your PC... ;)
Sweet. How sensitive to EMP are these systems? If there is low vulnerability then this would be important for control systems for the electrical grid as regards sun-spots and other threats. Also the DOD could be a good source of funding, as this would pertain to EMP shielding on equipment. Tanks, ships and aircraft must carry heavy shielding. If less shielding is necessary this would at least improve power-to-weight ratios and takeoff loading. Also spacecraft could benefit if this is more resistant.
Dear Larry and Andrew: There are many proof-of-concept ideas that are literally decades old. Beyond the proof of concept is design for manufacture at a price point where the market will adopt for competitive advantage. Intel, like many other companies such as AMD, Analog Devices, Texas Instruments, etc., has the same design-for-manufacture problems. Part of this is having access to a fab, and the other is developing the manufacturing technology. Intel, like many others, has as usual done a poor marketing job in presenting and positioning its R&D efforts. Few in marketing have a true understanding of the implications of technology and how to present their achievements in context for the market. There is more to the market than marketing to the CEO. All the Best, Vladimir*********
50Gbps is slow; we already have 100Gbps per fiber, and it's a standard too. Nothing new here. Now 1Tbps, that I want to see.
50Gbps? I believe a high-def movie would be 50 gigabytes, not gigabits. Now if you meant 50 gigabytes per second, that would be amazing. The optical cortex can transmit about 52 gigabytes per second.
* One hour of SDTV video at 2.2 Mbit/s is approximately 1 GB.
* Seven minutes of HDTV video at 19.39 Mbit/s is approximately 1 GB.
* 114 minutes of uncompressed CD-quality audio at 1.4 Mbit/s is approximately 1 GB.
* A DVD-R can hold 4.7 GB.
* A dual-layered Blu-ray disc can hold 50 GB.
* A Universal Media Disc can hold 0.9 GB of data. (1.8 GB on dual-layered discs.)
http://en.wikipedia.org/wiki/Gigabyte
Knew back in 1984 that it was possible. Amazing things are going to happen. How about data bus widths of 100's of megabits or more, terahertz clocks... and more, much more.
So, like, this should help run Crysis OK? :D This seems to be great news for computing, but what are the implications for networking? And for storage? Can we store information at that speed already? Can we retrieve data at that speed from storage? Are RAM drives coming back with a vengeance at last?