Air Flow Presentation Pdf



It is recognized within the industry that most data centers are not energy efficient, and traditional data center designs do not fully address energy optimization. While data center managers struggle with uptime and reliability, business executives are looking for ways to reduce capital and operational expenses and improve the bottom line. Green initiatives are in place not only to save money but to be environmentally responsible. New green data center designs, based on hot- and cold-air containment, have started to become more popular. Containment strategies and air flow optimization are recognized as a way to achieve both technical and business objectives. By separating hot and cold air within the data center, capital and operational expenses can be reduced for the business, and a more stable and predictable environment can be achieved for the IT organization.


1. Data Center Air Flow Solutions: Converting Chaos into Order
Managing Your Infrastructure The Smart Way

2. Agenda
- Industry Trends and Findings
- Data Center Issues
- Wright Line Strategies
- Wright Line Product Solutions
- Air Flow Management Products (Wright Line and Vendor Neutral)
- Heat Containment
- Aisle Containment
- Independent Containment
Wright Line LLC, 2007 | Confidential and Proprietary

3. Industry Findings and Trends
- "Power consumption is so significant that accommodating the increase in IT power consumption over the next five years will require the U.S. to construct 10 major power plants. If current trends continue, we will need another 20 between 2010 and 2015. That's 30 power plants that need to be built just to accommodate the growth within IT power." Source: Uptime Institute
- "Forecasts indicate that unless energy efficiency is improved beyond current trends, the federal government's electricity cost for servers and data centers could be nearly $740 million annually by 2011, with a peak load of approximately 1.2 GW." Source: Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431, U.S. Environmental Protection Agency ENERGY STAR Program, August 2, 2007

4. Industry Findings and Trends
- "People think that a $2,500 server is so cheap you don't have to worry about it. But over three years, the cost of electricity nearly equals the cost of the server, and that's without the capital expenditure for building the data center or the cost of running it." Source: Uptime Institute
- All data centers should optimize air flow, but this step is especially effective in legacy data centers where air flow management was not considered at build-out, or where the current implementation is a conventional hot-aisle/cold-aisle setup. In these arrangements there is poor separation between the cold supply and hot return airstreams. Creating a physical barrier that separates hot and cold airstreams provides the highest degree of separation.
- Any of three approaches (cold aisle containment, hot aisle containment, and rack containment) can provide the physical separation, each offering its own advantages and limitations. Source: Silicon Valley Leadership Group
- Business will continue to look for ways to reduce data center operational costs. Data center energy consumption has reached the CEO's desk and will continue to become a strategic issue. Source: McKinsey Report

5. Industry Findings and Trends: Data Center Facilities, Reducing Data Center Power Consumption
- Data center power consumption is getting more important: as a major concern, power consumption/conservation increased from 48% to 55% of respondents.
- The behavior concerning power consumption has begun to change: "When the power bill gets really significant, it ends up on the CEO's desk. When he sees that the biggest user is IT, IT has to deal with it."
- Still, 28% of survey respondents don't know whether their power bill has increased or decreased, though this is an improvement over the 36% who didn't know last year.
- Among respondents who are paying attention, a majority see major increases in the power bill for their data centers: 44% have seen an increase, and 19% say the increase is greater than 10%.
- Hot aisle/cold aisle containment, the practice of sealing hot aisles and cold aisles in a data center, gained traction in 2009.
- Some data centers do hot-aisle and cold-aisle containment themselves, and some buy a system from a provider such as Wright Line, APC, or Liebert.
- Containment really wasn't on the radar until late last year, but already 30% of respondents have implemented it, and an additional 15% plan to next year.
- The cold-aisle containment strategy is slightly more popular with respondents than hot-aisle or plenum containment.
Source: Uptime Institute, 2009 Data Center Survey

6. Industry Findings and Trends: Increasing Computer Room Air Conditioner Performance
- ASHRAE TC9.9 ambient conditions: dry bulb temperature of 68 to 77 °F and 40-55% RH at the air inlet. The trick is that there is usually a range of temperatures, not one temperature.
- It's all about the delta T (ΔT). 30-ton units are not really 30-ton units; they deliver 30 tons only at certain operating conditions.
- Example: a 30-ton nominal, downflow, chilled-water CRAC unit may have a sensible capacity of 27 tons at 72 °F, but only 23 tons when operated at 68 °F. That is 18% more capacity at the higher operating temperature.
- Capturing return air at a higher temperature allows a higher delta T and increases performance: 88 °F air entering the CRAC will produce the full 30 tons, a 30% improvement.
Source: Data Center Journal

7. Industry Findings and Trends: Increasing Chiller Temperature Yields Significant Savings
- Normally, for centrifugal compressor-based chillers, an increase of one degree in the chilled-water supply temperature can increase the operational efficiency of the chiller by 1 to 2 percent.
- If a chiller can supply chilled water at 55 °F, it will be approximately 15 to 30 percent more efficient than when it produces (cooler) chilled water at 40 °F. Source: Dr. Tengfang Xu, June 15, 2005, Lawrence Berkeley National Laboratory

8. Problems in the Data Center
- Legacy designs, or lack thereof
- Conventional designs that did not anticipate shifts in technology
- On-going changes that affect the dynamics of the data center
- 24x7x365 uptime
- Lack of funding: on-going capital and operational expenses; doing more with less
- Perceived and real risks
- Usually the number-one problem, stated as an immediate need, is heat and power ("I have enough cooling, but I still have heat problems, and we are running out of power.")

9. Current State of Most Data Centers Regarding Air Flow Management
- Over-provisioned air cooling adds to cost, yet cooling is still a problem:
  - 90% of data centers have enough or too much cooling
  - 60% of the cool air is wasted due to air mixing, air stratification, and bypass air
- Why the problem?
  - Technological changes that were not planned or anticipated
  - Immediate business needs creating short-term solutions without consideration of efficiency
- The cost of running a data center has reached the radar of CEOs and C-level executives: CapEx and OpEx expenses, and green initiatives
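The CRAC capacity figures on slide 6 follow from the standard sensible-heat rule of thumb for air, Q [BTU/hr] ≈ 1.08 × CFM × ΔT [°F], with 12,000 BTU/hr per ton. A minimal sketch; the 12,500 CFM airflow and 55 °F supply temperature are hypothetical values chosen for illustration, not figures from the slides:

```python
# Rule of thumb for sensible cooling of standard air (assumption: sea-level air):
#   Q [BTU/hr] ~= 1.08 * CFM * delta_T [deg F];  1 ton of cooling = 12,000 BTU/hr
def sensible_tons(cfm, return_f, supply_f):
    """Sensible capacity delivered at a given airflow and air-side delta T."""
    return 1.08 * cfm * (return_f - supply_f) / 12_000

# Hypothetical CRAC moving 12,500 CFM with 55 deg F supply air:
# capturing hotter return air widens delta T and raises sensible capacity
# proportionally, which is the slide's point about return-air temperature.
low_return = sensible_tons(12_500, 72, 55)   # ~19 tons
high_return = sensible_tons(12_500, 88, 55)  # ~37 tons
```

The same unit delivers substantially more sensible tonnage when the return air arrives hotter, which is why containment (which stops cold air from diluting the return stream) recovers "lost" CRAC capacity.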
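Slide 7's chiller rule (roughly 1 to 2 percent efficiency per degree of chilled-water supply temperature) reproduces the quoted 15-30 percent range for a 40 °F to 55 °F change; a quick sketch of that arithmetic:

```python
# Rule quoted on slide 7: each +1 deg F of chilled-water supply temperature
# improves centrifugal-chiller efficiency by roughly 1 to 2 percent.
def chiller_efficiency_gain_pct(old_supply_f, new_supply_f, pct_per_degree):
    """Approximate efficiency gain (in percent) from raising supply temperature."""
    return (new_supply_f - old_supply_f) * pct_per_degree

low_estimate = chiller_efficiency_gain_pct(40, 55, 1.0)   # 15 percent
high_estimate = chiller_efficiency_gain_pct(40, 55, 2.0)  # 30 percent
```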
10. Wright Line's Answers to Technical and Business Objectives
- Wright Line can help clients reduce capital and operational expenses:
  - Reduce the need for cooling
  - Reduce heat within the data center
  - Increase density levels within enclosures and increase floor space
  - Implement a more predictable environment
- Maximize existing infrastructure investments: cooling, power, and floor space
- Less (or no) disruption; fewer stranded assets

11. Wright Line Methodology
- Analyze the current environment (planning):
  - Industry specifications: PUE, DCiE
  - Modeling tools (CFD)
  - Practical horse sense
- Questions to answer: How much power and AC do I need now and for the foreseeable future? Can the current data center be adapted to the new technology? What are the biggest paybacks? What is the best phased approach to move forward?
- Our solutions utilize services and enabling technology, and provide a road map to achieve the technological and business objectives with the fastest ROI that is practical in your environment.

12. Why Chaos Cooling in a Data Center?
- Legacy cooling designs employ an open supply- and return-air methodology that drives mixing of both supply and return air streams.
- Cool air in a legacy data center is used for many purposes: to cool IT equipment, to keep warm air away from IT inlets, and to move warm air toward the return system.
- Minor changes in any element of the data center create unpredictable behavior, which decreases reliability.

13. How Does Chaos Manifest Itself in the Data Center?
- Recirculation of air from IT equipment exhaust finds its way to IT inlets and can reduce server performance or even cause servers to stop working.
- Air stratification is the layering of air masses of different temperatures; it forces the set points of precision cooling equipment lower than recommended.
- Bypass air is the remixing of cool supply air that enters the return air stream directly; it drives down precision cooling efficiency.
- To prevent processing impact, data centers produce significantly more cold air than the IT devices require.

14. Recirculation
- Hot exhaust air circulating back into its own intake can cause device thermal overload.
- The typical manufacturer inlet temperature threshold for device operation is 95 °F.
- Exceeding the manufacturer's operating threshold can lead to unplanned computing system outages and data loss.

15. Temperature Stratification
- A significant gradient of air temperatures beyond ASHRAE TC9.9 places devices at risk of thermal overload.
- Maintaining inlet temperature gradients within the ASHRAE recommended range significantly saves energy.
- Device inlet temperature ranges: ASHRAE TC9.9 recommended, 64.4 to 80.6 °F; typical manufacturer specification, 50 to 95 °F.
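Slide 11 names PUE and DCiE as the planning metrics. As commonly defined (by The Green Grid), PUE is total facility power divided by IT equipment power, and DCiE is its reciprocal; a minimal sketch with made-up power readings:

```python
# PUE and DCiE as commonly defined by The Green Grid:
#   PUE  = total facility power / IT equipment power
#   DCiE = IT equipment power / total facility power = 1 / PUE
def pue(total_facility_kw, it_load_kw):
    return total_facility_kw / it_load_kw

def dcie(total_facility_kw, it_load_kw):
    return it_load_kw / total_facility_kw

# Hypothetical facility: 1,000 kW at the utility meter, 500 kW reaching IT loads.
example_pue = pue(1000, 500)    # 2.0: half the power feeds cooling, UPS losses, etc.
example_dcie = dcie(1000, 500)  # 0.5, i.e. 50% of facility power does IT work
```

Containment improves these numbers by reducing the cooling share of total facility power while the IT load stays the same.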
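The ranges quoted on slides 14 and 15 lend themselves to a simple inlet-temperature check. A sketch: the thresholds come from the slides, while the sample readings and the classification labels are invented for illustration:

```python
# Thresholds quoted on the slides: ASHRAE TC9.9 recommended inlet range of
# 64.4-80.6 deg F, and a typical manufacturer operating limit of 95 deg F.
ASHRAE_RECOMMENDED_F = (64.4, 80.6)
MANUFACTURER_MAX_F = 95.0

def classify_inlet(temp_f):
    """Classify a server inlet temperature against the quoted ranges."""
    lo, hi = ASHRAE_RECOMMENDED_F
    if temp_f > MANUFACTURER_MAX_F:
        return "thermal overload risk"
    if lo <= temp_f <= hi:
        return "within ASHRAE recommended range"
    return "outside recommended range"

# Hypothetical sensor readings from three rack inlets:
statuses = {t: classify_inlet(t) for t in (68.0, 84.5, 97.2)}
```

A reading between the recommended and manufacturer limits (like the 84.5 °F sample) is the warning sign of recirculation: the device still runs, but the margin before thermal overload is shrinking.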