Record Of 186 Gbps Internet Speed Set At SuperComputing 2011 Conference

At the SuperComputing 2011 (SC11) conference in Seattle, a team of high-energy physicists, engineers, and computer scientists set a new world record for data transfer. They achieved it by moving data between the SC11 convention floor in Seattle and the University of Victoria Computer Centre, in opposite directions simultaneously, at a combined rate of 186 gigabits per second (Gbps) over a wide-area network circuit.

Transferring in opposite directions means the team sent data over a 100 Gbps bidirectional fiber-optic link between the University of Victoria and the Caltech booth, sustaining 98 Gbps in one direction and 88 Gbps in the other at the same time.
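As a quick sanity check on the figures above, the two one-way rates sum to the reported record; a minimal sketch (the variable names are illustrative, not from the article):

```python
# Combined throughput of a bidirectional link is the sum of the
# simultaneous one-way rates (figures as reported in the article).
victoria_to_caltech_gbps = 98   # one direction of the 100 Gbps link
caltech_to_victoria_gbps = 88   # the reverse direction

combined_gbps = victoria_to_caltech_gbps + caltech_to_victoria_gbps
print(combined_gbps)  # 186, the record combined rate in Gbps
```

Note that each direction stays under the 100 Gbps capacity of the circuit; only the combined bidirectional figure reaches 186 Gbps.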

The headline figure itself is not unprecedented: 100 Tbps has been demonstrated under laboratory conditions (at very high cost and impractical to deploy), and 26 Tbps has been achieved over a single optical fiber with a single laser, but only in a private network, again in the laboratory. What sets this record apart is that it was achieved over a standard, commercially available 100 Gbps link. It is anticipated that the methods used to reach 186 Gbps could help build the next generation of network technology, in which data transfer rates of 40 Gbps to 100 Gbps become routine.

"Our group and its partners are showing how massive amounts of data will be handled and transported in the future," says Harvey Newman, professor of physics and head of the high-energy physics (HEP) team. "Having these tools in our hands allows us to engage in realizable visions others do not have. We can see a clear path to a future others cannot yet imagine with any confidence."

This fast transfer rate is also crucial for ongoing research into the nature of matter, space, and time, and the search for new particles, which depends on the tremendous amount of data coming from the Large Hadron Collider (LHC) at CERN. So far, more than 100 petabytes of data, the equivalent of 4 million Blu-ray discs, has been processed, distributed, and analyzed, and that volume is expected to grow a thousandfold as physicists crank up the collision rates and energies at the LHC. The team also says the technology could be deployed within a couple of years, easing the data-transfer crunch caused by the internet's ever-growing user base. The video below demonstrates the 100G network between the University of Victoria and the Caltech booth.
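The Blu-ray comparison is easy to verify, and the same numbers show why link speed matters for the LHC. A rough sketch, assuming single-layer 25 GB discs and decimal units (the article only states the 4-million figure):

```python
# 100 petabytes expressed as single-layer Blu-ray discs.
DATA_PB = 100
BLURAY_GB = 25  # assumed single-layer disc capacity

discs = DATA_PB * 1_000_000 / BLURAY_GB  # 1 PB = 1,000,000 GB
print(f"{discs:,.0f} discs")             # 4,000,000 discs

# Time to move the full 100 PB at the record 186 Gbps rate.
total_bits = DATA_PB * 1e15 * 8
seconds = total_bits / 186e9
print(f"{seconds / 86400:.0f} days")     # roughly 50 days
```

Even at the record rate, the current dataset would take on the order of weeks to ship in full, which is why a thousandfold growth in LHC data makes faster wide-area links a pressing need.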
