Wired Communications

The increase in data is driving change in datacenters

Everything is Data, Data is Everything

Tektronix’s Sarah Boen and David Akerson discuss the broad trends shaping the datacenter and wired communication market.

With the rapid proliferation of connected devices and sensors, we’re generating more data than ever before, and this trend is expected to continue. This exponential growth raises some challenging questions: where do we store all that data, how do we move it where it needs to be when it’s needed, and how do we access it quickly enough to extract value from it?

The digitization of everyday objects and the spread of sensors in an Internet of Things world have sparked an explosion in the amount of data created.

“As an example, when you take a flight, the amount of data that’s being transferred from all the sensors on that airplane is tremendous,” says Boen. “Boeing’s ecoDemonstrator 787 uses more than 140,000 data points, generating terabytes of data per flight. All of this data needs to get into the datacenter and be processed quickly, in real time.” With over 100,000 commercial flights every day, that adds up to a lot of data—and that’s just one narrow example of the massive volumes of information being generated every day.

Airplanes generate massive amounts of data.
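
To put that in perspective, here is a back-of-envelope estimate. The per-flight volume is an illustrative assumption (the article cites only “terabytes of data per flight”); the flight count is the figure quoted above.

```python
# Back-of-envelope estimate of daily in-flight sensor data.
# TB_PER_FLIGHT is an illustrative assumption, not a published figure.
TB_PER_FLIGHT = 2          # assumed average terabytes generated per flight
FLIGHTS_PER_DAY = 100_000  # commercial flights per day, per the article

daily_tb = TB_PER_FLIGHT * FLIGHTS_PER_DAY
print(f"~{daily_tb:,} TB/day, or ~{daily_tb / 1_000_000:.1f} EB/day")
# ~200,000 TB/day, or ~0.2 EB/day
```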

“Over the last decade, the amount of data that datacenter managers have to comprehend has changed dramatically. In this timeframe, the industry has gone from talking about managing the flow of gigabytes of data to exabytes and zettabytes of data,” illustrates Akerson. “This data has a tremendous potential to make businesses operate more efficiently and profitably, and in some cases, provide new services to end users.”

To date, extracting value from this data in real time has been limited by technology. Over the next decade, Tektronix predicts new technologies will emerge, allowing organizations to better capitalize on the potential this data provides.

Hot Data: Make Way for PCI Express

A second trend in the datacenter is PCI Express’ emergence as a storage interface. This shift has been driven by PCI Express’ performance advantage: it delivers data faster and with lower latency, which has a significant impact on an organization’s efficiency and profitability.

PCIe implementation for storage is increasing year over year. (Source: IDC, 2017)

“Five years ago, PCI Express storage wasn’t the predominant storage interface, but the performance advantages of the technology in terms of throughput and low latency have accelerated and will continue to accelerate its adoption,” explains Akerson. “To illustrate why storage is getting greater attention, Amazon did a study and determined that every 100 ms of latency costs it one percent in sales. Google determined that an extra 500 ms in search page generation time dropped traffic by 20 percent. Another study showed that a broker could lose $4M per millisecond if his or her electronic trading platform is 5 ms behind the competition. For these companies, legacy interfaces don’t cut it, prompting them to move to NAND-based storage utilizing the PCI Express interface.”
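
As a rough illustration of why those numbers drive purchasing decisions, the sketch below turns the quoted sensitivities into dollar figures. The revenue baseline is hypothetical, and the broker figure is applied as a simple linear loss; only the per-millisecond sensitivities come from the studies Akerson cites.

```python
# Rough cost-of-latency model. Only the sensitivities (1% of sales per
# 100 ms; $4M per millisecond) come from the studies quoted above; the
# $100B revenue input below is a hypothetical example.
def sales_lost_per_year(annual_revenue, added_latency_ms):
    """Amazon-style estimate: 1% of sales lost per 100 ms of added latency."""
    return annual_revenue * 0.01 * (added_latency_ms / 100)

def trading_loss(ms_behind):
    """Broker estimate, read linearly: $4M lost per millisecond behind."""
    return 4_000_000 * ms_behind

print(f"${sales_lost_per_year(100e9, 100):,.0f}")  # $1,000,000,000 per year
print(f"${trading_loss(5):,.0f}")                  # $20,000,000 at 5 ms behind
```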

The number of SSD-based devices connected to the network is increasing the amount of data transmitted and stored.

“Whether an organization is storing data in the cloud or locally, the challenge of delivering large amounts of data with low latency will continue to drive advancements in storage interface standards and the introduction of new technologies,” says Boen. “Storage interfaces and standards such as PCI Express are advancing quickly, with PCI Express Gen 4 (16 GT/s) storage products in development and Gen 5 (32 GT/s) on the horizon. To keep pace, the SCSI Trade Association announced its 24G Serial Attached SCSI (SAS-4) specification.”
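
Those transfer rates map to usable bandwidth roughly as follows. This is a minimal sketch assuming the 128b/130b line coding PCIe has used since Gen 3 and a typical x4 NVMe drive; it shows raw link bandwidth before packet and protocol overhead.

```python
# Usable per-lane bandwidth for the PCIe generations mentioned above.
# Gen 4 and Gen 5 use 128b/130b line coding.
GT_PER_S = {"Gen 4": 16, "Gen 5": 32}  # transfer rate per lane, GT/s
ENCODING_EFFICIENCY = 128 / 130        # 128b/130b payload efficiency

for gen, gts in GT_PER_S.items():
    lane_gbps = gts * ENCODING_EFFICIENCY  # usable Gb/s per lane
    x4_gbytes = lane_gbps * 4 / 8          # GB/s for a typical x4 NVMe SSD
    print(f"{gen}: {lane_gbps:.2f} Gb/s/lane, ~{x4_gbytes:.1f} GB/s at x4")
# Gen 4: 15.75 Gb/s/lane, ~7.9 GB/s at x4
# Gen 5: 31.51 Gb/s/lane, ~15.8 GB/s at x4
```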

"For manufacturers utilizing these technologies, there is still one consideration," says Boen. “While these products have tremendous performance advantages, these new technologies are impacted by the golden rules of the datacenter: datacenter operators want to keep everything low-cost, low-power. Those are key drivers for them.”

New Technologies, New Complexities

Along with changes in storage and server technology, network communication technologies are going through similarly dramatic changes to provide faster data transmission. With these changes, the level of complexity has increased, and will continue to increase.

Comparison of NRZ vs. PAM4 signaling

Modulation schemes such as PAM4 are becoming a popular alternative to NRZ. Though PAM4 increases the transmission rate, it comes at the cost of added signal noise and a heavier test burden. “The number of tests for somebody who’s developing a PAM4 transceiver is greater than 10x what it was with NRZ,” explains Akerson. “If you’re a company and you’re designing a PAM4 transceiver, some of your biggest challenges are how do I get my product out fast, and how do I reduce my test-time costs?”
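
The trade-off is easy to see in miniature. PAM4 maps two bits onto each of four amplitude levels, so it doubles throughput at a given symbol rate, but each of its three eye openings is only one third the height of an NRZ eye (roughly a 9.5 dB SNR penalty), which is where the extra noise and test burden come from. The Gray-coded level mapping below is a common convention, shown as an illustrative sketch rather than any particular standard’s encoding.

```python
# PAM4 carries 2 bits per symbol on 4 levels; NRZ carries 1 bit on 2.
# Gray coding keeps adjacent levels one bit apart to limit error impact.
GRAY_PAM4 = {"00": -3, "01": -1, "11": +1, "10": +3}  # normalized levels

def pam4_symbols(bits):
    """Map a bit string to PAM4 levels, two bits per symbol."""
    return [GRAY_PAM4[bits[i:i + 2]] for i in range(0, len(bits), 2)]

print(pam4_symbols("00101101"))  # [-3, 3, 1, -1]: 8 bits in 4 symbols

# Same 56 Gb/s line rate, two signaling choices (the hedge described below):
print(f"{28e9 * 2 / 1e9:.0f} Gb/s")  # 28 GBd PAM4, 2 bits per symbol
print(f"{56e9 * 1 / 1e9:.0f} Gb/s")  # 56 GBd NRZ,  1 bit per symbol
```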

Another key challenge facing manufacturers is uncertainty over which technology to use. “There’s a sheer lack of clarity on what technology is going to work, and so many are implementing multiple technologies,” Boen says. “If you look at roadmaps of different customers, you’ll see 28 GBd PAM4 and also 56 GBd NRZ, which is the traditional signaling methodology.

They’re doing both to hedge their bets, because they’re not exactly sure what’s going to work best, and the signaling technology might work better in one application versus another.” As Boen explains, manufacturers are trying to adapt to these complexities in a few ways. “The first thing they’re doing is upfront simulation of how things will work. This includes overclocking existing products to see how they run at higher speeds.”

Not Waiting for Standards

Another broad trend in datacenters and wired communication is who’s driving technology changes. “In the past, the datacenter interconnect was really driven by IEEE or OIF-CEI,” explains Boen. Now, she says, large datacenter operators like Amazon, Apple, Facebook, Google, and Microsoft are pushing the technology forward and not necessarily waiting for standards to be completed.

“While the Amazons, Googles, and Facebooks of the world are active participants in standards bodies, they’re also driving in-house technology development and, in some cases, defining technology standards,” says Boen. “They’re on the boards of directors for these consortiums to drive the technology forward, but because standards bodies tend to move slowly, Multi-Source Agreements are popping up, founded by groups of companies that are driving technology evolution faster to meet their business needs.”

Smaller, Localized Datacenters

“New standards and interface technologies still won’t be enough to keep pace with growing data demands,” according to Boen. That’s why she predicts that datacenters will move closer to the sources of data, and get smaller as a result. This will be especially critical for so-called hot data—that is, data that needs to be acted upon quickly.

“Think about the automobile industry,” Boen says. “Smart cars, self-driving cars: the exchange of data is going to leverage the datacenter and the whole networking infrastructure. That’s hot data you need to get quickly. So, I think we’re also going to see more, smaller datacenters that are closer to the point of service.”

Data generated by cars is collected in datacenters.

These smaller, localized datacenters will likely be linked with coherent optical technology. According to Boen, this will come with new testing challenges. “We think that we’re going to see more coherent optical technology within the datacenter,” she says. “On a transceiver, there’s an electrical interface that goes off a switch into an optical transceiver, and that might go away with optics directly on the switch to save power. I think we’re going to see different technologies, or maybe the same technologies but used in different ways. That’s going to change the ecosystem in terms of the players, and change the type of testing our customers are required to do.”

Measurement Technology in a Changing World

These broad trends contribute to new testing challenges that must be overcome. The interplay of new standards, approaches, and technologies makes measurement more important than ever before. “Design complexities are a lot greater,” says Akerson. “You have to run more tests to make sure your product is solid.”

“There’s a lot going on in this industry, and the challenge for us as a test vendor is to be prepared for that,” says Boen. “Our customers are looking for us to provide flexible test solutions that they can target at those different applications, without having to buy a dedicated instrument for each and every one. Those are some things that we’re working on for our future platforms.”

As datacenters and datacenter technology change, the methods we use to measure our designs must change as well. From new modulation schemes and interconnect protocols to evolving datacenters, optical links, big data, and new ways of developing standards, one aspect of wired communication remains constant: testing will continue to be a critical factor in the success of a project.

Find out more about applications in wired communications
