Put Your 'Carbonivore' Data Center on a Diet
Article written and contributed by Michael Dickman, Chief Product Officer, Gigamon.
Questions like "How can I do better by my organization's bottom line?" and "How can I do better by our planet?" run through organization leaders' minds daily. Increasing economic and environmental pressures are reshaping business priorities and practices with each passing week. In my recent article, "How to Put Your 'Carbonivore' Data Center on a Diet," I discussed how data centers currently consume about 3 percent of global electricity and account for roughly 2 percent (and rising) of all global carbon emissions, equivalent to the entire airline industry. For reference, the entire energy sector, from the manufacture of fertilizers, pharmaceuticals, and refrigerants to oil and gas extraction, accounts for 3.6 percent of global carbon emissions.
It's perhaps no surprise that our customers are telling us that energy efficiency, and with it carbon emissions, matters to them. Reducing operational complexity is a cornerstone of all Gigamon efforts, and that focus naturally lends itself to helping our customers reduce power consumption and energy costs.
We've been working extensively behind the scenes to validate and quantify how Gigamon can support our customers' growing demands for energy efficiency. And we're pleased to share that we can now demonstrate and quantify these capabilities.
How Does Gigamon Help Customers Reduce Carbon and Costs?
One of the most strategic ways to reduce power consumption and energy costs, along with the underlying carbon emissions, is to determine which network traffic is processed by which tools. For every kilowatt-hour invested in Gigamon, organizations can save as much as 11 kilowatt-hours in tool efficiencies by significantly reducing the volume of network data their tools must process. The Gigamon Deep Observability Pipeline enables our customers to gain visibility and streamline efforts across their data centers by applying the following techniques:
1. Application Metadata: Application Metadata extracts and summarizes application, protocol, and session context about network traffic, efficiently delivering this Layer 4–7 intelligence to third-party tools, such as security, observability, and performance monitoring tools.
2. Application Filtering: Application Filtering identifies well-known applications by traffic signature, even when encrypted, creating a triage system of high-risk and low-risk data by filtering out traffic from high-volume trusted apps.
3. De-duplication: De-duplication identifies and removes duplicate packets before sending network data to tools, resulting in fewer redundancies.
4. Flow Mapping®: Flow Mapping sends specific subnets, protocols, VLANs, and other types of traffic to specific tools to ensure that only the relevant network data is sent to address each tool's unique requirements.
5. Flow Slicing: Flow Slicing is a highly efficient optimization method that drops non-initial packets in every user data session.
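To make de-duplication and Flow Slicing concrete, here is a minimal illustrative sketch, not Gigamon's implementation. It models a pipeline stage that forwards each unique packet only once and keeps only the first few packets of each flow; the `optimize_stream` function, its parameters, and the flow-ID format are all hypothetical:

```python
import hashlib
from collections import defaultdict

def optimize_stream(packets, slice_after=5):
    """De-duplicate and slice a packet stream before it reaches tools.

    `packets` is a list of (flow_id, payload_bytes) pairs, where flow_id
    identifies a session (e.g. a 5-tuple string). Returns the packets
    that would actually be forwarded to downstream tools.
    """
    seen_digests = set()            # fingerprints of packets already forwarded
    flow_counts = defaultdict(int)  # packets accepted so far, per flow
    forwarded = []
    for flow_id, payload in packets:
        digest = hashlib.sha256(flow_id.encode() + payload).digest()
        if digest in seen_digests:
            continue                # de-duplication: drop a repeated packet
        flow_counts[flow_id] += 1
        if flow_counts[flow_id] > slice_after:
            continue                # flow slicing: drop non-initial packets
        seen_digests.add(digest)
        forwarded.append((flow_id, payload))
    return forwarded
```

In this toy model, a session with thousands of packets contributes only `slice_after` packets to the tool tier, which is where the large reductions in processed volume come from.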
By combining these techniques, our customers could reduce the power consumption, carbon emissions, and energy costs of their tool infrastructure by as much as 87 percent over a five-year period. That's good for the bottom line and for our planet.
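As an illustration of the Flow Mapping technique listed above, the routing decision can be thought of as a first-match rule table. The sketch below is hypothetical (the rule set, tool names, and `map_flow` function are invented for illustration, not Gigamon's API):

```python
from ipaddress import ip_address, ip_network

# (predicate, destination tool) pairs; the first matching rule wins.
RULES = [
    (lambda p: p["protocol"] == "DNS",                          "dns-security-tool"),
    (lambda p: ip_address(p["src_ip"]) in ip_network("10.20.0.0/16"),
                                                                "pci-monitoring-tool"),
    (lambda p: p["vlan"] == 300,                                "voip-performance-tool"),
]

def map_flow(packet, default="discard"):
    """Route a packet (dict with protocol, src_ip, vlan) to the one tool
    whose rule matches first; unmatched traffic never reaches any tool."""
    for predicate, tool in RULES:
        if predicate(packet):
            return tool
    return default
```

Because unmatched traffic falls through to `discard`, each tool receives only the slice of the network it actually needs to inspect.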
Introducing Our Power Savings Calculator
We recently expanded our popular Cost Savings Calculator to include a Power Savings Calculator that enables our teams to give customers a concrete view of the energy reduction opportunities within their unique tool infrastructure. This, in turn, enables customers to quantify the overall energy reduction opportunity across their hybrid cloud infrastructure.
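To show the kind of arithmetic such a calculator performs, here is a back-of-the-envelope sketch. All figures and the linear power-scaling assumption are illustrative, not Gigamon's actual model; the function name and default rates (electricity price, grid carbon intensity) are invented for the example:

```python
def tool_energy_savings(tool_power_kw, traffic_reduction, hours=24 * 365 * 5,
                        price_per_kwh=0.12, kg_co2_per_kwh=0.4):
    """Estimate five-year energy, cost, and carbon savings for a tool tier
    whose power draw is assumed to scale linearly with ingested traffic.

    traffic_reduction is the fraction of traffic no longer sent to tools
    (e.g. 0.60 means 60 percent less data processed).
    """
    kwh_saved = tool_power_kw * traffic_reduction * hours
    return {
        "kwh_saved": kwh_saved,
        "cost_saved_usd": kwh_saved * price_per_kwh,
        "co2_saved_kg": kwh_saved * kg_co2_per_kwh,
    }

# Example: a 10 kW tool tier whose ingested traffic is cut by 60 percent
savings = tool_energy_savings(tool_power_kw=10, traffic_reduction=0.60)
# kwh_saved works out to 262,800 kWh over five years under these assumptions
```

A real calculator would account for non-linear power curves, cooling overhead (PUE), and regional grid carbon intensity, but the shape of the estimate is the same.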