Networking Archives | TierPoint, LLC

How and Why Software-Defined Networking (SDN) Works
https://www.tierpoint.com/blog/how-and-why-software-defined-networking-works/ | Tue, 06 Aug 2019

Each day, petabytes of data move between data centers, colocation facilities, cloud providers, and end users. The speed at which that data travels is critical to application performance and to the end user's experience. However, IT managers are often challenged to find the right balance between network performance and affordability. The public internet is cheap but often slow; a high-bandwidth, dedicated connection offers speed, but at a high cost. Enter the software-defined network (SDN), a newer approach to networking that has the potential to provide both speed and affordability.

With SDN, network services are separated from network hardware, enabling IT organizations to create programmable networks irrespective of the underlying equipment.

In the July 17 webcast What's Hot in Software-defined Networking: Solutions for On-demand Connectivity, TierPoint's Dominic Romeo, director of product management, along with Jay Turner, vice president of development and operations for PCCW Global, and Jonathan Rubin, an assistant vice president at PCCW Global, discussed the challenges that network administrators face today and why many are adopting SDN for their networks.

Why Many are Turning to Software-Defined Networking

The underlying trend that is driving SDN adoption is the rapid rise in data and network traffic, which is straining both public and private networks. This rise in traffic is driven by several factors, including mobile computing, Internet smart devices, increased video content and the widespread adoption of cloud computing.

"Traffic volumes are increasing at a ridiculous rate. I think everyone has ten devices that have some sort of compute ability," said Turner.

According to reports by Cisco, more than 28 billion devices and connections will be online by 2022, and video will account for 82% of all IP traffic. The cloud is a big contributor, with Cisco forecasting that global cloud data center traffic will reach 19.5 zettabytes (ZB) per year by 2021 and account for 95% of all data center traffic.

The traditional method for increasing bandwidth is to call the local telecom provider to add a new connection. However, that means paying for bandwidth that may not be needed much of the time.

"You open a new office or expand an office and you need either a new or larger access circuit," said Turner. "That means a two- or three-year contract for full capacity, and it can take weeks to get the circuit provisioned."

What are Software-Defined Networks?

SDN is an alternative to those traditional methods that can potentially offer both high performance and affordability. SDN separates the provisioning and routing intelligence from the network hardware and abstracts it into software—the SDN controller. SDN includes the ability to detect packet loss, latency and congestion rates, and make real-time decisions on the best route for different types of data.
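As a sketch of how that decision logic works, a controller can score each candidate path on the metrics it observes and steer traffic to the lowest-scoring one. The routes, metric values, and weights below are invented for illustration; real controllers use far richer policy engines:

```python
# Hypothetical sketch of SDN path selection: score each observed route on
# latency, packet loss, and congestion, then pick the best. All figures
# and weights here are illustrative, not from any real product.

def score(route):
    """Lower is better: weighted sum of latency, loss, and congestion."""
    return (route["latency_ms"] * 1.0
            + route["loss_pct"] * 50.0
            + route["congestion_pct"] * 10.0)

def best_route(routes):
    return min(routes, key=score)

routes = [
    {"name": "mpls",     "latency_ms": 20, "loss_pct": 0.1, "congestion_pct": 10},
    {"name": "internet", "latency_ms": 45, "loss_pct": 0.5, "congestion_pct": 40},
    {"name": "lte",      "latency_ms": 60, "loss_pct": 1.0, "congestion_pct": 20},
]
print(best_route(routes)["name"])  # mpls: score 125 vs. 470 and 310
```

Different traffic classes could simply use different weights, so latency-sensitive voice and bulk backup traffic would each land on the path that suits them.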

What is SD-WAN? An organization can also use SDN on its wide-area network (SD-WAN) to create, provision, and manage network connections over its existing infrastructure, whether that includes dedicated circuits, the public internet, or even mobile networks.

Because SDN can create connections on demand, it’s useful for disaster recovery. In case of an outage, the network manager can spin up a backup network using any available network infrastructure, such as the public internet.

“It doesn’t really matter what that bottom layer is, because SDN comes in on top of it,” explained Turner. “Rather than having to pay for two access circuits, the SDN gives you the ability to provision that redundant circuit on demand.”
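The on-demand redundancy Turner describes amounts to provisioning a backup circuit only at the moment the primary fails. This is a minimal sketch; `provision_circuit()` stands in for a hypothetical controller API call, not any specific product's:

```python
# Illustrative on-demand failover: bring up a redundant circuit over whatever
# underlay is available only when the primary goes down. The function names
# and transport labels are hypothetical.

def provision_circuit(transport):
    """Ask the (hypothetical) SDN controller for a circuit over this underlay."""
    return {"transport": transport, "active": True}

def ensure_connectivity(primary_up, backups=("public-internet", "lte")):
    """Provision (and pay for) a backup path only when the primary fails."""
    if primary_up:
        return None  # no standby circuit sitting idle on a multi-year contract
    return provision_circuit(backups[0])  # cheapest available underlay first

backup = ensure_connectivity(primary_up=False)
print(backup)  # {'transport': 'public-internet', 'active': True}
```

The cost argument is in the `primary_up` branch: while the primary is healthy, nothing is provisioned and nothing is billed.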

SDN may offer big benefits to users of cloud-based applications, which depend on high-bandwidth connections to operate.

“Latency will make your SaaS application inoperable,” noted Rubin.

Data center and colocation providers have begun offering SDN products and services. An example is PCCW Global’s SDN software platform, Console Connect, which is sold through PCCW Global’s data center partners. It’s those data centers, including TierPoint’s 40-plus colocation facilities, that provide the physical connections to Tier 1 and 2 networks and to cloud providers such as Amazon, Google, Alibaba, Tencent and Microsoft Azure.

Top three SDN and SD-WAN features to consider

If you're considering an SDN, there are three features that experts recommend.

  • Connections to multiple providers. Rubin recommends choosing an SD-WAN provider that offers connections to Tier 1 and Tier 2 network providers as well as to multiple cloud providers. Most organizations use services from several different cloud vendors, so it makes sense to opt for as many cloud connections as possible; look for a data center and SDN service that supports connections to a wide range of cloud providers.
  • Route and latency options. A new trend in SDN is the ability to see the expected latency for a potential SDN route before selecting it. “If you need a connection from Johannesburg to London, you’ll be presented with the different potential routes and the latencies for those routes,” said Rubin.
  • User-friendly, feature-rich interface. The user interface is an important consideration, noted Romeo. You want one that makes it easy to view and manage the end-to-end network, including network APIs and route options, and to automate policies and tasks such as creating circuits at specific times.

SDN can help IT departments struggling to meet bandwidth needs and stay within budget by enabling them to create on-demand bandwidth and optimize existing infrastructure. It can also enable administrators to bring up new connections faster.

“SDN can make an IT manager’s life easier,” said Rubin.

More on Software-Defined Networking

For more information on the how and why of SDN, watch the webcast What’s Hot in Software-defined Networking and download “The Essential Hybrid Networking Guide.”

How Will 5G Affect Edge Computing?
https://www.tierpoint.com/blog/how-will-5g-affect-edge-computing/ | Thu, 18 Jul 2019

In the final part of our interview on edge computing with Dominic Romeo, TierPoint Director of Product Management, Dom addresses the advent of 5G networks and how they will affect edge computing and edge data centers.


Does 5G help or hurt Edge Computing adoption?

Interviewer: Last time we spoke, we talked about how data center operators can prepare for the coming of 5G. A lot of people, myself included, think about reduced latency when we think about the impact of 5G. When 5G becomes broadly available, do you think it will lessen the need for edge computing?

Romeo: I think it will accelerate it. More than just speed, 5G is also about capacity. Sometimes those get confused with each other, so to clarify, let’s revisit the analogy of a car on the highway. If 5G is that car going 100 miles an hour, then 4G is like that car going 90 miles per hour or maybe even 80 miles per hour.

So, there is a difference in speed, but if 4G is like a four-lane highway, 5G is like a 16-lane highway. That’s because, with 5G, we’re also opening up new lanes by leveraging different parts of the electromagnetic spectrum. In the past, all of our cell phones ran on 700 MHz, 900 MHz, or 1600 MHz. 5G opens up whole new frequencies like 24 GHz and 60 GHz.

There’s a reason we never used these frequencies in the past. They only work over short distances, they don’t penetrate trees, and they don’t penetrate walls. But in urban cores or in dense suburban areas, new high-density 5G footprints will let more users connect at higher speeds simultaneously.

Think about it this way: instead of five or six people connecting to one cell phone tower streaming Game of Thrones, now you can have 600 people connecting to one cell phone tower all watching at the same time. That is going to open up more access to data and to services.
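That capacity claim holds up as back-of-envelope arithmetic. The sector-capacity and stream-bitrate numbers below are illustrative assumptions, not carrier specifications:

```python
# Back-of-envelope check on the "5 or 6 viewers vs. 600 viewers" claim.
# Capacity and bitrate figures are illustrative assumptions, not carrier specs.

HD_STREAM_MBPS = 5  # rough bitrate of one HD video stream

def concurrent_streams(sector_capacity_mbps):
    """How many HD streams fit in one cell sector's capacity."""
    return sector_capacity_mbps // HD_STREAM_MBPS

print(concurrent_streams(30))    # 6 streams on a congested 4G sector
print(concurrent_streams(3000))  # 600 streams on a multi-gigabit 5G sector
```

The point of the analogy survives the arithmetic: a roughly hundredfold jump in sector capacity translates directly into a hundredfold jump in simultaneous viewers.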


How will 5G impact businesses beyond common consumer uses?

Interviewer: But how will that impact businesses and not just those binge-watching Netflix?

Romeo: Let’s stick with streaming services for a moment since they’re so easy for people to relate to. Between Netflix and Hulu and Disney and such, we think we live in a world crowded with services. But, I fully anticipate more video streaming services to enter the market because the technologies are readily available to create a really great user experience for very little capital outlay.

In the past, if you wanted to start up a Netflix or Hulu, you had to have a vision as well as the budget and technological expertise. But today, you could buy virtual resources from TierPoint, on-demand network resources from Megaport, and a cloud backend from Azure. You have practically no capital outlay on any of that. You’re just paying monthly bills as you go for the building blocks you need to build a next-generation streaming service.

Video streaming is just one example. It’s probably a difficult one because video streaming involves licensing and such that can get really expensive and complex. There are far more easily executable innovations waiting in the wings, and I think we will be astounded at the way entrepreneurs take new technologies to create products and services with an amazing user experience now that 3G and 4G – not to mention the capital needed – are no longer limiting factors.

That said, developers have always pushed the envelope of what our hardware can do and probably always will. That laptop with the blazing-fast speed you bought five years ago, now runs like molasses. That’s not necessarily because the equipment is wearing out, so much as it is because the applications you’re running demand so much more.

So as businesses create new products and services that leverage 5G, they’re going to continue to be concerned with decreasing latency to the lowest level possible. I believe that the increased number of users accessing those systems as well as the sophistication of the applications will actually drive latency concerns. Edge data centers will continue to be a way to combat the issue.

Also read: Connectivity is Key to Powering Your Multicloud Strategy

Considering Edge Computing?

You need the right data center in the right place to power your edge applications. A provider like TierPoint, with high-speed connectivity, interconnection among its data centers, and a range of connectivity options, can help you achieve your goals. Contact us today to learn more.


Latency Takes Center Stage in Healthcare Cloud Discussions
https://www.tierpoint.com/blog/latency-takes-center-stage-in-healthcare-cloud-discussions/ | Fri, 08 Mar 2019

When it comes to choosing a cloud environment, data security has always been one of the top concerns of business and healthcare IT leaders. We expect it to remain near the top in 2019 and beyond, but its position as #1 may be challenged by a relative newcomer to the list: latency.

Response Time Becomes a Critical Factor

We don’t foresee a time when security isn’t a key priority. In fact, cyber threats are greater than ever, but so is our ability to combat them. With one of our toughest cloud challenges mitigated (although far from neutralized), we can now put more of a focus on other priorities. Latency is one of these that is surfacing in our conversations with healthcare IT professionals.

We're sharing some highlights from a Forrester Research report outlining their predictions for healthcare in 2019. At a high level, Forrester predicts that virtual care, AI (artificial intelligence), and new CX (customer experience) metrics will become high priorities. Every one of these predictions fuels our belief that latency will be a top challenge for healthcare IT execs to resolve. Let's look at each in turn:

Virtual care

Just 14% of doctors and 11% of patients feel they have an adequate amount of time together. (The Physicians Foundation) Healthcare providers of every category are under constant pressure to see more patients in a day, while still providing the same quality of care and outcomes. Virtual care, the ability to provide care via technology, offers great promise. For example, a physician might monitor the vital signs of a patient with a chronic illness using wearable technologies that transmit data to the physician’s office, replacing the many routine trips to the clinic that patients with chronic diseases often need to make.

Though this will allow the physician to take on a greater patient load, their time will still be at a premium. Waiting for data to load every time they want to review a patient’s vital signs won’t be acceptable. For that matter, if there is a problem, delays can be life threatening. In the case of an incident like a stroke or atrial fibrillation, the extra thirty seconds it takes an alert to reach the doctor due to network congestion could mean the difference between life and death.

Artificial intelligence

AI is a core part of the virtual-care equation. Let’s say you’re a new father at home with a sick baby. Prior to virtual care, you typically called a nurse’s hotline and spoke to someone with a general understanding of childhood illnesses. It won’t be long, though, before you’ll be able to speak to an automated assistant so human-like that you might not be able to tell the difference.

This automated assistant will be able to analyze additional data that a nurse typically wouldn’t have in front of them such as knowledge of a recent outbreak of strep at a nearby daycare center or the family’s history of asthma. The automated assistant will use this data plus your answers to their questions to adjust the flow of the conversation.

To be sure, analyzing data like this will take a level of data interoperability that just isn’t there yet, but the industry is working on it. When it is available, it will also require almost instantaneous response times. To have a conversation that feels natural to the caller, this data will need to be gathered and micro-analyzed in a split second. Latency is not an option.
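That "split second" can be made concrete as a latency budget: every stage of the exchange has to fit within the pause a caller still perceives as conversational. The stages and millisecond figures here are invented for illustration:

```python
# A hypothetical latency budget for a natural-feeling automated response.
# Stage names and millisecond figures are invented for illustration.

BUDGET_MS = 300  # roughly the pause a caller still perceives as conversational

stages = {
    "network round trip": 40,
    "speech-to-text": 80,
    "record and outbreak-data lookup": 60,
    "model inference": 90,
    "text-to-speech start": 25,
}

total = sum(stages.values())
print(f"{total} ms ({'within' if total <= BUDGET_MS else 'over'} budget)")
```

Notice how little headroom is left: a single congested hop that adds a few hundred milliseconds blows the entire budget, which is why moving the workload closer to the caller matters.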

Customer Experience

There’s already been an element of the customer experience in both of our previous examples. In the virtual care example, the patient isn’t going to want to wait around for data to load any more than the physician is. In the AI example, the new father doesn’t want to hear thirty seconds of silence while the bot he’s talking to downloads data.

Beyond that, there’s the issue of day-to-day online interactions through patient portals and other sites. We’ve become an instant-gratification society, and if it takes longer than a millisecond for your site to load, customer satisfaction suffers.

Edge Computing Reduces Latency

Several factors impact latency, including internet and network traffic congestion, and optimization of your network resources. One of the best ways to reduce latency, however, is to deploy your workloads at the edge, i.e., as close to your end users as possible, instead of in a central data center that might be hundreds if not thousands of miles away.

Dominic Romeo, senior product manager at TierPoint, wrote a comprehensive post last year on edge data centers – Edge Data Centers: Keeping Up With Consumers and IT.  TierPoint’s President and CFO, Mary Meduski, also participated in a panel discussion at SXSW called How Connectivity Will Control Everything We Know. The panelists tackled big topics like the future of IT infrastructure and the edge of the network and cloud. Check out the recording of the full session.

Edge computing is one of the reasons we maintain over 40 data centers located in communities across the United States. Sure, we have data centers in major metro areas like Chicago and New York, but we also have data centers in smaller cities like Little Rock, Omaha, and Spokane. You can see the full list and arrange a tour of any of our data centers on our website.

Read the full healthcare trends report

Finally, be sure to download the Forrester report Predictions 2019: Healthcare for insights on the challenges and opportunities healthcare IT professionals may see in 2019.

Next-Gen Firewalls Provide Advanced Cybersecurity Protection
https://www.tierpoint.com/blog/next-gen-firewalls-provide-advanced-cybersecurity-protection/ | Fri, 07 Dec 2018

Most companies today have some sort of firewall to protect their data and applications from network security threats. But traditional firewalls no longer provide sufficient protection against today's increasingly sophisticated cybersecurity threats. Instead, many IT departments are replacing them with next-generation (NG) firewalls, which contain a more advanced array of defensive technologies and can safeguard a network against most types of cyberattacks.

Unlike traditional firewalls, which provide basic packet and URL filtering, next-generation firewalls offer multiple security features such as network intrusion detection, malware filtering, website blocking, and web application protection. For the small- to mid-sized business that lacks the resources for an enterprise security solution, a next-generation firewall can provide all-in-one cybersecurity protection. For large enterprises, next-generation firewalls are valuable components of a comprehensive cybersecurity solution.

Unfortunately, some IT professionals fear that next-generation firewalls are too feature-rich for their needs or too sophisticated to deploy easily. Instead, they make do with a traditional firewall or create a piecemeal solution out of standalone cybersecurity products.

That is a mistake, say cybersecurity experts Bob Pruett and Vincent Delbar.

The next step in the firewall services evolution

Pruett, field security solutions executive at SHI, and Delbar, technical partner manager at Fortinet, spoke on Best Practices when Deploying Next Generation Firewalls. The webcast, moderated by Darren Carroll, director of products at TierPoint, explained the features of next-generation firewalls and best practices for implementing them.

Today's next-generation firewalls are easier to deploy and configure than versions from several years ago. In addition, most provide the ability to activate different security features as needed, so an organization can start with basic traffic monitoring and add capabilities when ready.

“For instance, once you see what kinds of web sites people are going to, you can start blocking certain categories or limiting certain kinds of applications,” explained Delbar.

Another benefit is the ability to monitor and manage all the cybersecurity features from one interface. That saves time and provides greater visibility into the overall threat status of the organization.

Next-generation firewalls do all the things that traditional firewalls do (packet filtering, network and port address translation, URL filtering, and stateful inspection), along with other, more advanced capabilities. These include:

  • Integrated intrusion detection and prevention, to identify and block attacks based on behavioral analysis or threat signatures
  • Application awareness, the ability to set policies that block ports or services on an application-by-application basis
  • Identity awareness, which enables IT to manage users, groups, and applications through customized, identity-based policies
  • Anti-malware protection, so malware can be detected and blocked before it enters the network
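The application- and identity-awareness items above boil down to rules that match on who the user is and what the application is, rather than on port numbers alone. A toy illustration of that rule-table idea (the groups, applications, and rules are invented for the example):

```python
# Toy sketch of application- and identity-aware filtering: rules match on
# user group and application, not just port numbers. First match wins,
# like a firewall rule table. All groups, apps, and rules are invented.

RULES = [
    {"group": "engineering", "app": "ssh",        "action": "allow"},
    {"group": "*",           "app": "bittorrent", "action": "block"},
    {"group": "*",           "app": "*",          "action": "allow"},
]

def decide(group, app):
    for rule in RULES:  # evaluate top-down; "*" is a wildcard
        if rule["group"] in (group, "*") and rule["app"] in (app, "*"):
            return rule["action"]
    return "block"  # default deny if nothing matched

print(decide("engineering", "ssh"))   # allow
print(decide("sales", "bittorrent"))  # block
```

A traditional firewall could only say "TCP port 22 open or closed"; the identity-aware version can say "SSH, but only for engineering," which is the policy granularity the webcast speakers describe.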


Watch the webinar, "Harnessing Artificial Intelligence & Emerging Technologies for Data Security", to learn more about the next frontiers in attack mitigation.

Tools for next-gen firewall success

An example of a next-generation firewall is TierPoint's CleanIP, which has all the features above as well as several others: web application vulnerability patching and DDoS mitigation; content filtering to block web pages and emails that violate company policy; support for VPNs with multi-factor authentication; SSL inspection of encrypted content; and regularly updated threat intelligence for IP reputation and malware signature filtering.

The ability to inspect encrypted content will be increasingly critical. Analysts estimate that 70 percent of malware will be encrypted by 2020. A firewall that lacks the ability to analyze encrypted traffic will soon be unable to detect the majority of malware.

Likewise, regularly updated threat intelligence for IP reputation and malware signature filtering is important, as attackers routinely modify their attacks to make them harder to detect. Newly emerging attack variants can only be identified and blocked by firewalls that are continuously updated with the latest threat signatures.

The bottom line is that most organizations would benefit from a next-generation firewall, which can fend off multiple types of cybersecurity threats and can be managed and monitored through a single interface. It's a far easier solution than a piecemeal collection of standalone security products.

Watch our recent webinar, "Best Practices when Deploying Next Generation Firewalls", with TierPoint's Darren Carroll, SHI's Bob Pruett, and Fortinet's Vincent Delbar to learn more.

How 5G Will Impact Businesses & Their Customers
https://www.tierpoint.com/blog/how-5g-will-impact-businesses-and-their-customers/ | Thu, 01 Nov 2018

In the final part (part 3) of our interview with Dominic Romeo, Senior Product Manager at TierPoint, we raise the discussion from the technical aspects of 5G to look at how it will impact businesses and their ability to serve their customers better.


How 5G will push businesses to adapt

Interviewer: So it sounds like 5G is going to have quite an impact, even though it’ll be different than the impact the evolution from 3G to 4G had. What level of effort do you think businesses will need to make to take advantage of 5G?

Dominic: That’s a good question because there are really two kinds of businesses: those who deploy 5G in their operations and those who benefit from 5G in the ecosystem. I’d like to focus on the latter group because they don’t have to do much of anything, yet the benefits will be tremendous.

Any business that communicates with end users through technology will immediately see a benefit when 5G is fully deployed. You’ll have the latency decreases; the speed increases. Web browsers will respond faster. Emails will download quicker. That’s just native in the 5G world.

Interviewer: Native, perhaps, but not to be underestimated. I believe Google did some research earlier this year that said the average mobile web page was taking more than 20 seconds to load, but people tend to abandon pages if they don’t load within three seconds.

Dominic: That’s right, and the business won’t have to do much of anything to take advantage of the increase in speed. Even a poorly designed website should be faster than it is today.

But to stay competitive, businesses will be compelled to evolve their go-to-customer strategies. Right now, we’re evolving from computer-driven interactions with our customers to mobile interactions. That is, we develop our websites and user interfaces as though the default interface is a PC browser screen, and then we ensure that the interface also works well across mobile devices. Or at least we should, but as we all know, some companies are better at it than others.

This approach is going to have to evolve to a mobile-first strategy. There are many, many end-users in the world, billions really, that will never use a computer to interact with your website. They will only use a mobile device. It’s been a couple of years since I saw the research, but as I recall, it said something like a quarter of millennials only use mobile devices. I would not doubt that percentage has continued to rise.

Interviewer: No doubt. They call the generation that came after the millennials the iGeneration for good reason. My daughter is in this group, and she rarely uses her laptop for anything. It’s all tablets and smartphones. It won’t be long before these kids are in the workforce and become a core market for goods and services.

Dominic: Right, so businesses will need to make sure they’re delivering an experience that is just as good, just as fast on the six-inch mobile device in the customer’s hand as it would be if they drove home and ordered on Amazon from their laptop. Instead of going from computer-first to mobile-first, our thinking is going to have to evolve to mobile-only. 5G will enable that evolution.

We’re getting there already. Even businesses that are late to the mobile-first party look at the traffic generated on their website and see that seventy to eighty percent of it is coming from mobile devices. As they watch that climb to ninety or ninety-five percent, it’s going to drive the remaining laggards to mobile-only.

Interviewer: There’s an internal side to this as well isn’t there? We’re talking about the end-user as the customer, but millions of businesses have salespeople and service personnel interacting with customers in the field, using only their mobile devices to connect back to the office systems.

Dominic: I look at these people as consumers as well. If you’re a road warrior, you need VPN to get back into the office. A lot of us who are working on a laptop from the road have to go find a WiFi hotspot with a good enough connection to allow us to get our work done reasonably quickly. That’s not always easy, and it can introduce security risks, although that’s probably a topic for another time. Once 5G is out there and running in the wild, we’ll finally be able to realize the concept of a truly mobile office.

Considering 5G in IT planning

Interviewer: When the head of IT looks to the future, what kinds of things should they be thinking about? For example, are there going to be infrastructure changes they’ll need to make once 5G devices become the norm?

Dominic: The big issue is going to be connectivity. More and faster connectivity. Right now, if you think about it, we’re a little bit “edge constrained.” You can see it in the conversations on throttling. For the most part, this was what the net neutrality discussion was all about. The increasing number of unlimited data plans offered by mobile carriers also plays a role in increasing the traffic on the net. The fact of the matter is, with today’s networks, you can’t have a hundred thousand customers streaming Netflix, or YouTube or high-bandwidth applications because there isn’t enough bandwidth there. There aren’t enough channels. There isn’t enough throughput.

But when that barrier is broken down, and 5G is connecting our end users at increased throughput and increased densities, then all of a sudden the links from those cell towers back to the data centers, or from those carriers' networks back to cloud providers, are going to get flooded with even more traffic than they carry now.

I think we’re all very aware that network traffic is growing at an incredible pace thanks to our love of technology. This seems to me to be just the tip of the iceberg. If data centers don’t have the connectivity they need to service these denser, higher throughput connections, they’re going to get quickly overwhelmed.

TierPoint can help you prepare for 5G

Managing your data and applications in the cloud can be a daunting task. Digital landscape changes, like evolving customer behavior due to 5G and security threats, can leave businesses asking themselves “What’s next?” A cloud partner can help you answer that question. At TierPoint, we have experts who know cloud and network services, understand your business needs, and guide you to the right IT infrastructure for success. Contact us today to learn how we can help you in your journey to IT transformation.

The Impact of 5G: Speeds and Deployment
https://www.tierpoint.com/blog/the-impact-of-5g-speeds-and-deployment/ | Tue, 16 Oct 2018

In part 2 of our interview with Dominic Romeo, Senior Product Manager at TierPoint, we go deeper into the nuts and bolts of 5G and where the greatest impacts will be for end users.

If you missed part 1 of our interview, where we talked to Dominic about the evolution to 5G by comparing it to the jump from 3G to 4G, you can access that here.

How Fast Will 5G Get Here?

Interviewer: Dominic, you also mentioned that you see 4G and 5G coexisting for a while. How long do you think it’ll take us to reach a fully 5G future?

Dominic: That’s the question everyone wants answered: “What will 5G get me, and how fast will it be here?” I don’t think anyone knows for sure. It has a lot to do with the end-user devices: your smartphones, your tablets, your handheld devices, even your laptops. When those devices start to get 5G chipsets and antennas, that’s when we’ll be really ready to leverage 5G.

Different definitions of 5G

Interviewer: Aren’t there carriers that claim to have a 5G network already in place?

Dominic: Yes, but from the consumer perspective, what device can I buy that can take advantage of a true 5G network? Nothing right now. Apple just introduced its new iPhone. It’s not 5G enabled. From what I’ve seen of the new Google phone coming out, it’s not 5G enabled either.

But that doesn’t mean the carrier’s claims are irrelevant. First, you have to understand that 5G standards are just being developed, and what the standards board or the diehard 5G engineer calls “5G” may be different than what a carrier, trying to serve its customers better and strengthen its market share, calls 5G. The carriers are evolving their networks from 4G to 5G to get ready. Some of these enhancements will translate into improvements today, but some won’t be realized until the devices catch up.

5G Bandwidth Potential

Interviewer: Can we get specific about how much of an impact 5G will have? We can talk more about the opportunities it will open up in a moment, but I think some of our readers would love to know exactly what level of improvement we’re talking about?

Dominic: 3G to 4G was a huge speed increase. You could get somewhere in the neighborhood of 3 or 4 Mbps maximum on your 3G device. Keep in mind that's usable throughput, not the theoretical numbers or the number the carrier advertises. Nowadays on my 4G device, I get an average of about 30 Mbps. That's as much as a tenfold increase in bandwidth over 3G.

When you make this sort of jump, some interesting things start to happen. Back when we had 3G, web pages didn’t have a lot of content. They were static, electronic versions of the company’s brochure. But as soon as 4G came around with its tenfold increase in bandwidth, it enabled probably a hundredfold increase in content consumption. We didn’t hear a lot about video marketing and YouTube as a way of reaching customers until we had 4G.

The theoretical maximums for 4G are in the neighborhood of 1Gbps. As we reach the latter stages of the 4G evolution, the underpinning technologies are well-developed, and we see actual speeds in the neighborhood of 100Mbps.
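To put those usable throughputs in perspective, here's a back-of-the-envelope sketch of how long a 100 MB download would take at each of the rough figures quoted above (these are the interview's illustrative numbers, not measured carrier speeds):

```python
# Rough download times for a 100 MB file at the usable throughput
# figures quoted above. Illustrative arithmetic only; real-world
# speeds vary with signal quality, congestion, and carrier.

FILE_MEGABITS = 100 * 8  # 100 megabytes expressed in megabits

def download_seconds(throughput_mbps):
    """Seconds to move FILE_MEGABITS at a given usable throughput."""
    return FILE_MEGABITS / throughput_mbps

for label, mbps in [("3G (~4 Mbps usable)", 4),
                    ("4G typical (~30 Mbps)", 30),
                    ("4G late-stage (~100 Mbps)", 100)]:
    print(f"{label}: {download_seconds(mbps):.0f} seconds")
```

At these rates, the same file drops from over three minutes on 3G to under ten seconds on late-stage 4G, which is the generational jump Dominic describes.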

So, 4G to 5G will also provide a bump in bandwidth, but it won’t be as significant as the bump from first iterations of 3G to later iterations of 4G. We’ve kind of maxed out the theoretical limits of what the traditional cellular airwaves can carry for us, so we need to invest in different ways of connecting. One of the interesting things that 5G does is that it codifies the concept of small cells.

For example, today, you might have one cell tower in downtown Omaha, Nebraska that covers 10,000 users or more. With small cells, you could have a cell per high-rise building, or a cell per square block versus square mile. When you do that, you can support much greater user density: fewer users connect to each of a greater number of antennas, so you can offer each user more bandwidth. Users are not competing as heavily for their share of airtime.
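The density argument comes down to simple division. Here's a sketch of that arithmetic; the cell capacity and user counts are made-up illustrative numbers, not 3GPP figures:

```python
# Back-of-the-envelope look at why small cells raise per-user bandwidth.
# Capacity and user counts are illustrative assumptions only.

def per_user_mbps(cell_capacity_mbps, users_per_cell):
    """Average share of a cell's capacity if it's split evenly."""
    return cell_capacity_mbps / users_per_cell

# One macro tower covering ~10,000 users (the downtown Omaha example)
macro = per_user_mbps(cell_capacity_mbps=1000, users_per_cell=10_000)

# The same area served by 100 small cells of the same capacity,
# each covering ~100 users
small = per_user_mbps(cell_capacity_mbps=1000, users_per_cell=100)

print(f"macro cell: {macro:.1f} Mbps per user")
print(f"small cell: {small:.1f} Mbps per user")
print(f"improvement: {round(small / macro)}x")
```

Same total spectrum, but splitting the coverage area across many antennas multiplies the airtime available to each user.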

This concept is built into the 5G-NR portion of the specifications that the 3GPP standards body is creating, and it is going to make a huge difference in high-traffic, high-density environments like urban cores and even suburbs to a certain extent. Don’t get me wrong, the change in bandwidth will be there, but it’s not going to be a 100X improvement over 4G. It might even be just a 2X or 5X bump in speed, at first. But the important takeaway is that 5G is giving different tools to these network operators, the tools they need to change the rules of the wireless game.

While we're talking about this, I should mention that they're also trying to use different parts of the wireless spectrum. You might have seen something about the FCC auctioning off wireless spectrum blocks. One of the reasons we have high-definition, over-the-air TV today is that the FCC freed up some of the bandwidth owned by old analog TV stations from forty or sixty years ago so it could be used by wireless providers.

5G Opens Up Unlicensed Spectrum

Interviewer: We’re getting into the weeds on 5G, but I think a good portion of our readers will find this really fascinating. Can you elaborate on how 5G will help us get more from the available spectrum?

Dominic: Spectrum is a limited commodity. You can only have a certain number of signals on a set of frequencies before they become saturated. It’s kind of like trying to force extra water down a hose. Without increasing the size of the hose or increasing the number of hoses, you’re not going to get any more water out the other end.

Another feature of 5G is the ability to use unlicensed spectrum. There are higher frequencies that we don't really use for communications today because they're great for short distances, but signals fall off quickly over longer distances. In the past, when our cell networks consisted of monolithic towers miles away from our receivers, these frequencies just wouldn't have the energy to transmit that far. With the move to small cells, the distances decrease, and that makes these higher frequencies usable; this also translates to higher bandwidth. To go back to the hose analogy, adding the higher frequencies is like both increasing the size of the hose and adding more hoses at the same time.
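The distance/frequency tradeoff shows up in the textbook free-space path loss approximation, FSPL (dB) = 20·log10(d) + 20·log10(f) + 32.44, with d in kilometers and f in MHz (this ignores obstacles and rain fade, so real losses at millimeter-wave frequencies are even worse). A quick sketch:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (textbook formula: d in km, f in MHz)."""
    return (20 * math.log10(distance_km)
            + 20 * math.log10(freq_mhz)
            + 32.44)

# The same 1 km link at a traditional cellular band vs. a
# millimeter-wave band of the kind 5G opens up
low_band = fspl_db(1.0, 2_000)    # ~2 GHz
mm_wave  = fspl_db(1.0, 28_000)   # ~28 GHz

print(f"2 GHz loss:  {low_band:.1f} dB")
print(f"28 GHz loss: {mm_wave:.1f} dB")
print(f"extra loss at 28 GHz: {mm_wave - low_band:.1f} dB")
```

That extra ~23 dB means roughly 200 times less received power at the same distance, which is why these higher bands only become practical once small cells shrink the distances involved.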

I think you can see why only focusing on 5G speeds might leave you a little underwhelmed. You need to look at the whole picture. We’re opening up new avenues for 5G to communicate in denser areas and on new spectrum, and this absolutely will be a game changer.

More on 5G

In this blog post Q&A series, we’ll also be looking at the deployment of the network technology behind 5G, the business impact of 5G and edge computing. Stay tuned for the final part of the Q&A.

> Interested in reading more about the future of the data center? Read our Q&A blog post on the Software-Defined Data Center of the Future.

Preparing the Data Center for a 5G Future https://www.tierpoint.com/blog/preparing-the-data-center-for-a-5g-future/ Tue, 09 Oct 2018 16:11:30 +0000 https://tierpointdev.wpengine.com/blog/preparing-the-data-center-for-a-5g-future/ If you read the trade press or IT blogs, you’ve probably read a lot about 5G and how, once it becomes available, the world will never be the same again. On the other hand, some talk about 5G as though it were just the next step up from the 4G world in which we live. So, which is it: evolution or revolution? And, what does it mean for the data center?

To find out, we sat down with Dominic Romeo, Senior Product Manager at TierPoint, to see how he views 5G and whether it'll be a "game changer" for his customers and TierPoint data centers. In the first part of the interview, Dominic sets the stage by talking about the impact of 5G, comparing it to the previous evolution from 3G to 4G.

The Potential of 5G

Interviewer: Dominic, we read a lot about how 5G is going to enable things like the autonomous vehicle revolution, and that’s great, but most of TierPoint’s customers aren’t in that business. Can you start us off by putting 5G’s potential into more practical terms?

Dominic: I like to equate it to the two waves of adoption. In the early days of any technology, you get the people who are the early adopters; they’re waiting on the technology, and they’re eager to use it because they know what kind of benefits it’s going to bring. Then, if the technology is really successful, really impactful, there will be a second wave of adoption in other market areas that never even knew they needed 5G. Now, suddenly, everything about their application and end-user experience is better because they have 5G.

Now, for everyone who's wondering how much better the experience is going to be, the best takeaway I can give is that it's a little more meaningful than what we saw in the 3G to 4G migration. That was primarily a bump in speed, but there was an evolution in technology, too. I won't get into all the geek speak because it doesn't mean anything to the end user. All they knew was that when they bought an iPhone 5 to replace their iPhone 3, the internet was way faster, and they could finally stream videos.

Before we go into how the evolution to 5G is different, I should add that 4G continues to evolve, providing faster and faster speeds for end-users, and I expect it to coexist with 5G for a very long time.

Interviewer: So, the average end-user doesn’t need to be worried about whether the iPhone they just invested in is going to become obsolete once 5G really hits the market?

Dominic: Absolutely not. Eventually, developers will catch up to 5G and develop applications that require 5G connectivity. For example, immersive virtual reality, whether it’s for gaming or a simulated training experience in business, is probably going to require 5G. And there are some experiences, like live-streaming a concert or a professional sporting event, that are going to be made immeasurably better by 5G, but the average user will be able to get by on 4G for quite some time, probably longer than the average lifespan of whatever device they’re holding in their hand right now.

The Evolution from 4G to 5G is user-centric

Interviewer: You said evolution to 5G will be different than the evolution from 3G to 4G. Let’s go back to that. What makes this evolution different?

Dominic: 4G to 5G is a very end-user centric step. We all know engineers can get lost in their love of technology. They can create new versions that are technologically awesome but do nothing for the end user. I think we’ve all experienced that at one time or another. We upgrade to a new version of an application that works very differently from the previous version, for no apparent benefit.

The biggest difference I see between 4G and 5G is that the smart people who developed these technologies have paid attention to more than just how much more bandwidth they can get. They've asked themselves, "What do we really need to do to create a more positive user experience?" So, they focused on application delivery, and they focused on lowering latency.

> Also read: Connectivity Is Critical in Hybrid Cloud Environments

5G lowers latency, improving end-user experiences

Interviewer: Let’s drill down on that for a moment. What does latency mean for the average end user?

Dominic: Latency is that time between when you click on something and when that content is delivered back to you. It’s that time when you click on your Netflix link and you have 5 to 10 seconds of buffering before you can watch an episode of your favorite show or a movie.

Focusing on latency and application delivery is going to be transformative for wireless networks. It's the lower latencies that people talk about when they talk about autonomous vehicles. The lower you can drive these latencies, the more responsive the vehicle can be to conditions. Same thing for an application running a remotely controlled autonomous warehouse. And, of course, we want this lower latency when we're playing a multi-player video game that requires regular downloads of additional data from the server, or when live streaming a football game.

These are just examples that I think everyone can relate to today, but there will most certainly be applications of 5G that those of us who aren't focused on application development have never dreamed of. A few years from now, we may suddenly find ourselves thinking back on those animated buffering symbols and wondering, "How did we ever get by?"

More on 5G

In this blog post Q&A series, we’ll also be looking at the deployment of the network technology behind 5G, the business impact of 5G and edge computing. Stay tuned for part II of the Q&A.

> Interested in reading more about the future of the data center? Read our Q&A blog post on the Software-Defined Data Center of the Future.

BraveIT Sponsor Spotlight: The Data Center Evolution https://www.tierpoint.com/blog/the-data-center-evolution/ Tue, 21 Aug 2018 17:17:26 +0000 https://tierpointdev.wpengine.com/blog/the-data-center-evolution/ If Darwin were writing about data centers, he'd say that the abodes of compute and storage are going through a phase of rapid evolution, and it's this evolution that is driving the massive degree of projected growth in the coming years.

Although enterprise companies are continually advised to give up their data centers and go to the cloud, the data continues to indicate that there are some applications that may be "cloud worthy" but are deemed too crucial to reside outside of in-house facilities. Of course, this doesn't make the enterprise facility the Neanderthal man of the data center family tree, but it does make them more resistant to change. However, a recent study by the research firm IHS Markit reported that a large number of companies intend to double the number of servers in their data centers in 2019, and we all know that they have to put those things somewhere.


Learn more about the data center of the future at Brave IT

The sensitive nature of the information gathered and housed within enterprise data centers has driven many firms to adopt a bifurcated approach to the cloud that preserves the need for "company-owned" data centers and their continued expansion. These organizations continue to hedge their bets in cloud adoption by migrating applications with lower security needs to the public cloud while using new and existing capacity across their facilities to leverage the benefits of cloud computing within a private context. Although the pendulum will continue to swing toward public offerings, technological improvements in areas like AI will offer attractive enterprise migration paths due to enhanced functionality in areas such as network capacity and performance, as well as superior security capabilities.

During a recent Bisnow Data Center Investment Conference, a consensus coalesced around the opinion that over 4,000 new data centers will be required by 2020. As might be expected, the vast majority of these new sites will be devoted to addressing the seemingly insatiable capacity requirements of the cloud and its leading providers. The hyperscale nature of cloud growth shows no sign of abatement when we consider the continued advancements in technologies like IoT, IIoT, VR, AR and AI. The rapidity of technological innovation has forced many data center providers to adapt their business model from the anticipatory "build it, and they will come" to one that emphasizes scale and speed of delivery in an attempt to keep pace with demand.

Also Read: Big Data, BI and AI Driving Cloud Adoption

Although edge data centers will be the vehicle to bring cloud functionality closer to end users, a great deal of ambiguity surrounds the concept. Interestingly, unlike previous data center holy wars surrounding the definition of things like retail v. wholesale and what constitutes a modular data center, edge-related questions may be even less easily resolved, since even the concept of what constitutes a "data center" will continue to evolve as the traditional functions of these facilities move closer to the end user.

While estimates point to a healthy future for data centers, the nature of their users is shifting. Enterprise facilities, for example, aren't going away, but their share of the market is declining relative to the cloud and, soon, the edge, even as the size of the overall data center "pie" continues to grow. In the future, the operative question will not be whether the current rate of data center demand persists, but how its distribution amongst its component entities continues to evolve along its own Darwinian path.

Edge Data Centers: Keeping Up With Consumers and IoT https://www.tierpoint.com/blog/edge-data-centers-consumers-iot/ Fri, 30 Mar 2018 00:25:33 +0000 https://tierpointdev.wpengine.com/blog/edge-data-centers-consumers-iot/ Edge data centers are delivering the network edge computing and network resources demanded by increasingly video-centric consumers and Internet of Things (IoT) businesses that need local compute.

How we got to the Edge: consumer demand

Internet traffic is booming, driven largely by video-centric consumers using smartphones. In 2021 there will be 127 times more global Internet traffic than there was in 2005, reports Cisco. So much of that traffic is video that “IP video traffic will be 82 percent of all consumer Internet traffic by 2021, up from 73 percent in 2016,” and up from 12% in 2006.
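As a sanity check on that figure, 127x growth over the 16 years from 2005 to 2021 implies a remarkably steady compound annual rate. The arithmetic (illustrative only; the rate itself is not a Cisco number) works out like this:

```python
# Implied compound annual growth rate behind "127 times more traffic
# in 2021 than in 2005." Simple arithmetic, not a Cisco figure.

growth_factor = 127
years = 2021 - 2005  # 16 years

annual_rate = growth_factor ** (1 / years) - 1
print(f"implied annual traffic growth: {annual_rate:.0%}")
```

Roughly 35% more traffic every year, sustained for a decade and a half; that is the kind of curve edge infrastructure is being built to absorb.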

It’s notable that the smartphone was introduced in 2007. Mobile devices have created an environment where consumers expect to have the information they want, anywhere, anytime. Increasingly, the information they want is video, with 60% of all video clips consumed on mobile devices, according to Ooyala.

The proliferation of video streaming – over the top (OTT) video – seems to be an unstoppable trend, such that 40% of the peak Internet traffic in the U.S. is Netflix, reports Statista. With more than 50 million US subscribers, Netflix subscriptions exceed all the top cable company subscriptions combined.

Social media plays a big part in video consumption today. Not long ago, Facebook, Twitter, and Instagram didn't exist. Now video-centric social media proliferates.

Executives Guide to Edge Computing [white paper]

Edge improves download speeds and reduces video buffering

Insatiable consumption of video created an environment of lag and buffering. It's better now, thanks to changes in Internet infrastructure. A big improvement came from content companies using content delivery networks (CDNs). CDNs let companies connect their content directly to local and regional ISPs, bypassing areas of Internet slowness.

On top of that is $1.6 trillion of investment in U.S. telecom infrastructure in the last 20 years, according to USTelecom. Wireline, wireless, and cable companies put fiber deep in the network, splitting nodes and moving nodes closer to homes, and laying fiber to nodes and homes. In the last several years we’ve seen a lot more local connections and bypassing of big Internet pipes that cause latency and slowness, which has dramatically improved the video experience for most consumers.

Edge data centers are robust and provide local connectivity

Download speeds have improved, but upload speeds need to catch up in an environment where consumers generate and upload their own videos, including live streaming. Making YouTube videos and engaging in Twitter Live, Instagram Stories, and Facebook Live are interactive experiences that require faster upload speeds. Edge data centers give social platforms better network connections, closer to their users.

As a country, the U.S. lags on Internet speeds, and we're even further behind on upload speeds, which require fiber to the home. Unless you have fiber directly to your home (which is the trend, but not available everywhere), you don't have symmetrical download and upload speeds. Faster upload speeds are needed for changing consumer usage patterns.

Also read: 3 Data Center Location Considerations

IoT use case for edge computing

The Internet of Things is another trend we’re seeing that is leading business to seek edge data center resources. IoT is very local and distributed. IoT devices require information and act on that information locally. In many cases, it doesn’t make sense to send that data far away for compute and then to send it back – only to generate latency. In addition to rich fiber in the last mile, compute and networking power are needed at the Internet’s edge.

TierPoint data centers deliver edge computing and connectivity

TierPoint helps companies create good edge networking paths and edge compute paths to analyze data that’s coming at the Internet’s edge. For example, we help social media platforms move content closer to users and find good networking interconnections to get to the last mile.

We also help Internet of Things (IoT) businesses with compute needs where local data needs to be analyzed and a decision needs to be made locally – without incurring latency to a distant data center.

We’re building more data centers, sized to accommodate our current regional customer base and new customers with latency sensitivities and last-mile connectivity requirements. Our edge data centers have the robust network connections that content companies need to reach their end users. The size of these edge data centers also drives economies of scale in power and cooling that benefit all customers, including businesses that need local compute to cut latency. And, of course, these data centers are highly secure.

Conclusion: the move to the network edge

Gartner said recently, “Currently, around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2022, Gartner predicts this figure will reach 50%.” Today, most compute is done far from where the activity is taking place. In the future, that will change: much more compute activity will take place at the network’s edge to avoid delays that surround sending massive amounts of data to distant places.

Contact us to learn how edge computing and edge data centers can fit into your IT infrastructure and digital transformation strategy.

Dominic Romeo is a Senior Product Manager and is responsible for all things network-related at TierPoint. In addition to helping create new products and answering detailed questions to tackle specific issues, he gathers feedback from customer interactions to guide product improvements and create new solutions to meet evolving business needs.

