Thinking Tech

Innovative fiber optic technology obliterates data speed record



Fiber-optic researchers experimenting with new, faster data technology report record-breaking results.

With much of the internet chugging along on phone lines and 56k modems, the arrival of broadband was once considered a godsend. But now, as data-intensive technologies like cloud computing and live streaming of high-def video become ubiquitous, many internet users are starting to feel a sense of cyber gridlock coming on again.

Not to worry, though. The fiber-optic problem solvers of the world seem to have anticipated that the day might come when seamless data transfers would require either an upgrade to a different kind of technology or better pipes, so to speak. Researchers experimenting with both approaches recently ran a series of tests and ended up with some record-breaking results.

In March, Dayou Qian, of NEC, reported a run in which he was able to transmit data at an unprecedented rate of 101.7 terabits per second across a distance of 165 kilometers. This was achieved by splitting the information across 370 separate lasers whose signals were received as a single light pulse at the other end.

Meanwhile, in Japan, Jun Sakaguchi of the National Institute of Information and Communications Technology in Tokyo one-upped that mark when his research team delivered data at 109 terabits per second. Their method involved creating a multi-core "super cable" fiber capable of transmitting seven separate light pulses at once.
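For a rough sense of what each approach asks of a single channel, here is a back-of-envelope sketch using only the figures quoted above (the aggregate rates and the channel/core counts); the per-channel numbers it derives are my arithmetic, not figures from either paper.

```python
# Per-channel arithmetic for the two record runs, derived from the
# aggregate rates and channel counts quoted in the article.
nec_total_bps = 101.7e12      # NEC: 101.7 Tb/s over 370 laser channels
nec_channels = 370
nict_total_bps = 109e12       # NICT: 109 Tb/s over a seven-core fiber
nict_cores = 7

nec_per_channel = nec_total_bps / nec_channels   # rate carried by each laser
nict_per_core = nict_total_bps / nict_cores      # rate carried by each core

print(f"NEC: {nec_per_channel / 1e9:.0f} Gb/s per laser channel")
print(f"NICT: {nict_per_core / 1e12:.1f} Tb/s per fiber core")
```

The contrast is the point: NEC's approach packs many moderate-rate lasers into one fiber, while NICT's pushes enormous rates through each of a handful of cores.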

Both presented their research at the recent Optical Fiber Communications Conference in Los Angeles.

To give you an idea of how puts-lightning-to-shame fast 100-terabit transfers are: at that rate, a computer could download three months' worth of HD video in about a second.

Impressive as the results are, New Scientist explains that both methods are still difficult to scale up:

Multi-core fibres are complex to make, as is amplifying signals for long-distance transmission in either technique. For this reason, Wang thinks the first application of 100-terabit transmission will be inside the giant data centres that power Google, Facebook and Amazon.

I guess the rest of us will have to put up with broadband bottlenecks for a while, but hopefully not for too long.

(via New Scientist)

Image: Wikimedia



Tuan Nguyen

Contributing Editor

Contributing Editor Tuan C. Nguyen is a freelance science journalist based in New York City. He has written for U.S. News & World Report, Fox News, MSNBC, ABC News, AOL, Yahoo! News and LiveScience. Formerly, he was a reporter and producer for the technology section of ABCNews.com. He holds degrees from the University of California Los Angeles and the City University of New York's Graduate School of Journalism. Follow him on Twitter.