Current Best Practices in High-Density Cooling Applications (ASHRAE NY-08-003, 2008)
ABSTRACT

Since its inception, ASHRAE Technical Committee 9.9 has published books, articles, and transaction papers in an effort to establish current best practices applicable to data center design and operation. Unfortunately, facilities managers, IT professionals, and design engineers continue to practice their specialties in the same manner that has been practiced for 10 or 20 years. Complicating this issue is that the higher load densities experienced in today's data centers have made the goal of 100% uptime more challenging to achieve, while at the same time making the consequences of downtime more damaging to the business's operations. Coupled with the need to keep energy costs down and the drive to practice sustainable design, a complete new set of best practices has been developed that addresses today's complex issues of data center design.

INTRODUCTION

In the design of data center cooling systems, there exist accepted "best practices" that have been in use for two or three decades, dating back, in essence, to the advent of mainframe computers and the development of the "precision cooling" concept. Those best practices were developed by various sources (e.g., equipment vendors, end users, designers, etc.) and reflected a variety of conflicting environmental guidelines. In more recent years, a steep increase in load densities has necessitated the abandonment of the older standards and the development of newer "best practices". These more current best practices were developed (and continue to be developed) by a consortium of industry practitioners, and are summarized in standards, transaction papers, articles, and books that have been published by ASHRAE TC 9.9. One of the more important of these publications, "Thermal Guidelines for Data Processing Environments" (ASHRAE, 2004), referred to as "Thermal Guidelines" from here on, defines the range of appropriate temperatures and humidities for the critical equipment needing to be cooled.

This paper addresses several of the "newer" best practices. Although some are not explicitly addressed in the publications noted above, these practices should be followed in order to achieve the design goals addressed by the "Thermal Guidelines".

Five best practices are addressed in this article; each can be simply stated as follows:

1. Separate the hot and cold air streams to the extent possible.
2. Use as high a supply air temperature as feasible while maintaining the recommended temperature range of 68°F to 77°F at the inlets to the computer equipment being cooled.
3. Modulate the cooling capacity to control the supply air temperature.
4. Use a dedicated outside air system to control space humidity, to control space pressurization, and to meet ventilation requirements (applicable for non-economizer applications only).
5. Use dew point (or absolute humidity) control.

The consequences of each basic principle are far-reaching. This article describes how each of these five principles, when applied properly, can improve the overall effectiveness and energy efficiency of the data center cooling system.

Vali Sorell, PE, is a senior associate for Syska Hennessy Group, Inc., Charlotte, NC.

NY-08-003. © 2008, American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (www.ashrae.org). Published in ASHRAE Transactions, Volume 114, Part 1. For personal use only. Additional reproduction, distribution, or transmission in either print or digital form is not permitted without ASHRAE's prior written permission.

BEST PRACTICE #1: SEPARATE THE HOT AND COLD AIR STREAMS

Chapter 4 of the "Thermal Guidelines" discusses how to place data processing equipment within a space to encourage the establishment of a separation of hot and cold air streams. The most immediate question to ask is why bother separating the air streams at all. Given that the separation is between hot and cold air, it's safe to assume that this separation will create stratification. How does this help? In order to understand this better, it's important to understand the concepts of bypass air and recirculation air.

Bypass air is conditioned air that returns to the air handling unit without having passed through (i.e., cooled) the equipment. Recirculation air is the hot air from the computer equipment discharge that is captured back at the computer equipment inlet. These two types of air streams are interrelated. If air bypasses the computer equipment, room air must make up the difference consumed by that equipment; this will be recirculation air. In the perfect data space cooling configuration, all cooling air passes through the computer equipment, and all hot computer equipment discharge air reports directly to the air handling unit (AHU or CRAC) to be cooled again. Typically, if the air consumed by the computer equipment is defined as 100% flow, the AHU flow should be somewhat greater in order to assure that recirculation air is eliminated or significantly reduced. The larger the AHU airflow relative to the computer equipment airflow, the lower the recirculation air. Increasing the airflow to the point where the recirculation airflow is eliminated seems like the perfect approach; however, in practice it's very difficult to know exactly how much air needs to be provided into the cold aisles to stop all recirculation air. In fact, recirculation air is never completely eliminated, as can be deduced from the simple observation that the air temperature entering the data equipment is rarely equal to the supply air temperature. The best one can do is to find a balance under which some recirculation air is tolerated, provided that all the computer equipment inlet conditions fall within the "Thermal Guidelines" envelope. Increasing the airflow beyond this balance condition is wasteful in that every additional CFM goes straight to bypass. Fan energy is used to move this extra air, and because it is bypass air (which lowers the return air temperature at the AHU), it also reduces the efficiency of the heat transfer process at the AHU coil.

The amount of recirculation air and bypass air can be quantified and reported respectively as Supply Heat Index (SHI) and Return Heat Index (RHI) (Sharma, 2002). In addition, the Rack Cooling Index (RCI) can be used to measure compliance with any environmental specification, such as the "Thermal Guidelines" (Herrlin, 2005). Using these indices, coupled with CFD modeling or actual field measurements, the design engineer can optimize for the minimal airflow required to just meet the required conditions at the inlets to the computer equipment. Although the actual application of these indices is beyond the scope of this article, the referenced literature can provide the methodology for quantifying these various performance indices.

Luckily, one does not have to perform these analyses with every data center design project. Following these simple best practices will help get close enough to the optimized SHI/RHI and RCI for most applications.

The way to minimize the cooling airflow required to cool the computer equipment or cabinets is to implement all measures to separate the cooling airflow from the hot discharge air, to the extent possible. These are not necessarily measures that are under the control of the design engineer, yet the design engineer must work closely with the IT professionals and the facilities managers to assure that the spaces are laid out in a way that guarantees an effective separation between the hot and the cold air streams. The steps required to achieve this strategy are summarized below:

1. Select a data space to be as tall as possible. If the floor-to-structure height is not fixed (i.e., if greenfield construction is an option), the separation between hot and cold air streams will be better if that height is maximized.

2. Create a hot aisle/cold aisle configuration. Various methods can be used for achieving a separation of hot and cold airstreams. For vertical underfloor air distribution (VUF), this involves placing perforated floor tiles in the cold aisles only. Likewise, for vertical overhead air distribution (VOH), this involves placing the supply air grilles over the cold aisles. In-row coolers or above-cabinet type vertical discharge coolers, in essence, pull hot air out of the hot aisles, cool it to reasonable supply air temperatures, and inject it into the cold aisles.

3. Block or minimize all sources of bypass and recirculation air. Cable penetrations can bypass cold air into the hot aisles; gaps or pass-throughs between cabinets can bypass air into the hot aisle as well as recirculate hot air into the cold aisle; gaps between servers can recirculate and bypass air within the cabinet; and gaps between raised floor panels, gaps between raised floor panels and adjacent walls, and even unsealed openings inside columns can all contribute to a substantial quantity of bypass air. Relatively easy solutions for each mode of bypass or recirculation air are available: for cable penetrations in raised floor environments, brush air restrictors should be used; gaps and pass-throughs between cabinets should be closed by assuring that cabinets are placed touching each other with no visible spacing between them; gaps between servers and all openings inside cabinets must be sealed using blanking panels; and all openings between raised floors, walls, and even openings in columns must be fully sealed and caulked.

4. Capture the return air to the AHU or CRAC unit as high in the space as possible. The hot aisle air will naturally rise to the top of the data room due to its lower density. The taller the space, the better the separation between the hot discharge air and the cold air, and the lower the likelihood of inducing a large recirculating air component. Where CRAC units are utilized, this measure may include using return air extensions to the top of the space; where built-up air handling units are utilized, partial-height walls or barriers should be constructed to prevent the low-lying cooling air from bypassing into the AHU and to encourage only the hot air at the top of the space to return to the unit. (Where tall data spaces are not an option, a reasonable alternate solution is to place the point of intake to the coolers directly in the hot aisle. CRAC units placed in the hot aisle, or in-row and above-cabinet coolers, are effective methods for minimizing recirculation air because they draw the hot discharge air directly into the CRAC units or coolers before it can mix with the cold aisle air.)

5. Use the full available height of the data space. If no ceiling is required for acoustical or aesthetic purposes, the separation of the hot and cold air streams will be best if no ceiling is installed. If a ceiling is required, use the ceiling as a return air plenum (Sorell, 2005). This involves installing a return extension from the CRAC units to the ceiling, and installing ceiling grilles directly over the hot aisles.

BEST PRACTICE #2: USE AS HIGH A SUPPLY AIR TEMPERATURE AS POSSIBLE

"Thermal Guidelines" (ASHRAE, 2004) establishes an environmental specification for data spaces to which data equipment providers and users have agreed. This envelope of recommended conditions for "Class 1" data centers is 68°F to 77°F and 40% RH to 55% RH. More importantly, this environmental envelope applies only to the inlet of the computer equipment. The temperature or humidity in the hot aisle, near the ceiling of the space, in the return air stream, or anywhere else in the space is irrelevant to the cooling design. Therefore, the cooling equipment should be selected and operated to deliver air at the conditions of these specifications at the inlets of the equipment.

Historically, cooling equipment was selected for a supply air temperature in the range of 55°F to 60°F. If the measures highlighted in Best Practice #1 are followed, the supply air temperature should be very close to the temperature at the computer inlets. An inlet temperature of 55°F is too cold for the equipment; there is no reason why a supply air temperature close to the low end of the environmental specification (i.e., 68°F) can't be used.

If 68°F is used as the supply air temperature, the chilled water supply temperature can be selected at a similarly higher setting. There are three direct benefits of using a higher chilled water supply temperature:

1. The chilled water temperature may become higher than the space dew point, meaning that the chances of moisture developing on the piping and on the cooling coils are reduced to zero. Chilled water piping has traditionally posed a risk to the computer equipment in data spaces, not necessarily from the possibility of leakage but rather from condensation that eventually destroys the piping insulation and puts data equipment and wiring at risk. If a typical space dew point temperature is 50°F, then a chilled water supply temperature of 52°F to 54°F may be a reasonable selection point.

2. The chiller can be selected and operated at a significantly higher efficiency, thereby reducing the operating cost of the plant.

3. If an economizer is used, the hours of economizer usage (for either airside or waterside economizers) are dramatically increased, thereby reducing the hours of operation of the chillers.
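The condensation criterion in benefit 1 can be checked numerically. The sketch below estimates the space dew point with the Magnus approximation; note that the formula and its coefficients are an assumption for illustration (the paper itself does not prescribe one), as are the example space conditions and chilled water setpoint.

```python
import math

def dew_point_f(dry_bulb_f: float, rh_percent: float) -> float:
    """Estimate dew point (degF) from dry bulb (degF) and relative humidity (%)
    using the Magnus approximation (assumed coefficients a=17.62, b=243.12 degC)."""
    t_c = (dry_bulb_f - 32.0) * 5.0 / 9.0
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * t_c / (b + t_c)
    td_c = b * gamma / (a - gamma)
    return td_c * 9.0 / 5.0 + 32.0

# Assumed space condition near the middle of the Class 1 envelope: 72 degF, 45% RH
space_dew_point = dew_point_f(72.0, 45.0)

# Candidate chilled water supply temperature from the range suggested above
chws_f = 54.0
print(f"space dew point ~ {space_dew_point:.1f} degF")
print(f"CHWS at {chws_f} degF stays above dew point: {chws_f > space_dew_point}")
```

At 72°F and 45% RH this evaluates to roughly 49°F to 50°F, consistent with the "typical space dew point temperature of 50°F" cited above, so a 52°F to 54°F chilled water supply would remain dry.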

BEST PRACTICE #3: USE SUPPLY AIR TEMPERATURE CONTROL

CRAC units are by default furnished with the thermostat placed in the return air stream. This may have its origins in the older practice of cooling the space to what was believed to be a requirement of a uniform temperature profile of 72°F (±1 or 2 degrees) throughout. Today's practice recognizes that there is no such thing as a fixed space temperature, especially at high load densities.

There is another point that needs to be highlighted. Data spaces are almost always unevenly loaded. Some areas of the space may be subjected to high loads; others will have low loads. Similarly, it is possible that the return air temperatures across the space are not uniform. Therefore, one CRAC unit may see a return air temperature of 70°F while another sees 80°F. The unit with the low return temperature may close off its coil completely in response to what it interprets as the space temperature, while the other unit may have its coil fully open and producing 60°F supply air. The result is that both units are on, one producing 70°F, the other producing 60°F. This may have two effects. If the air is not mixed in the underfloor space, the supply air temperature may not be uniform throughout. The other possibility is that the supply air flows mix thoroughly and become diluted with the higher supply air temperature of the inactive unit. In essence, the effective supply air temperature becomes 65°F. This is wasteful in that chilled water is produced at a less energy-efficient condition to produce air at 60°F while the effective supply air temperature delivered to the space is 65°F.

Another important issue relates to the strategy of selecting the controlled variable. If the condition of the air at the inlets to the servers is t
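The two-unit dilution scenario above reduces to a flow-weighted mixing calculation. This is a minimal sketch of that arithmetic; the equal airflow values are an assumption (the paper implies similar flows but gives no numbers).

```python
def mixed_supply_temp_f(flows_cfm, temps_f):
    """Flow-weighted average temperature (degF) of several airstreams,
    valid when the streams mix thoroughly, e.g. in a shared underfloor plenum."""
    total = sum(flows_cfm)
    return sum(f * t for f, t in zip(flows_cfm, temps_f)) / total

# One CRAC fully loaded at 60 degF, one idling at 70 degF, assumed equal airflow
effective = mixed_supply_temp_f([10_000, 10_000], [60.0, 70.0])
print(f"effective supply air temperature: {effective:.0f} degF")  # prints 65 degF
```

With equal flows the plenum delivers 65°F air even though the chilled water plant worked hard enough to produce 60°F air, which is exactly the waste the text describes; supply air temperature control avoids it by having both units target the same discharge setpoint.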
