Cisco Lab Renovation

Cisco Lab #1305 - MSI Company Project

Cisco is a $49 billion company and a worldwide leader in IT. It designs, manufactures, and sells networking equipment. Cisco is the largest networking provider in the world, with 85% of all Internet traffic traveling through its systems. Cisco has data centers located around the world, including Boxborough, Massachusetts, the location for this award submission. Data centers house a network’s most critical systems and are vital to the continuity of daily operations. The security and reliability of data centers and their information is a top priority for Cisco.

MSI Mechanical Systems was invited to bid on the renovation and expansion of an existing Cisco data center, called Lab 1305. The general contractor on the job, Votze Butler Associates, asked MSI to participate in the bidding process, which focused on providing environmental control for the data center through a new pump room that included computer room air conditioners (CRACs), pumps, and new chillers.

The sole purpose of data center cooling technology is to maintain environmental conditions for information technology equipment. Occupant comfort is not the primary goal; keeping the IT equipment cool is. Like any well-designed system, a data center cooling system should operate continuously and reliably. If a cooling system malfunctions, the servers and the critical data they store are at risk.

MSI added twenty-three 50-ton CRAC units for the new data center, along with two additional 250-ton chillers installed on the roof next to the existing five chillers.
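For a sense of scale, one ton of refrigeration equals about 3.517 kW of heat removal; a quick sketch of the arithmetic on the figures above (the conversion constant is standard, the totals are illustrative):

```python
# Rough cooling-capacity arithmetic for Lab 1305 (illustrative only).
# 1 ton of refrigeration = 3.517 kW of heat removal (standard conversion).
TON_TO_KW = 3.517

crac_total_tons = 23 * 50      # twenty-three 50-ton CRAC units
new_chiller_tons = 2 * 250     # two additional 250-ton roof chillers

print(f"CRAC capacity: {crac_total_tons} tons (~{crac_total_tons * TON_TO_KW:.0f} kW)")
print(f"New chiller capacity: {new_chiller_tons} tons (~{new_chiller_tons * TON_TO_KW:.0f} kW)")
```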

The biggest challenge MSI faced was that this was a renovation and expansion of an existing data center: the design-build tied into a cooling system that was already in place and could not be shut down while MSI worked on the project.

Cisco did approve a three-day shut-down to allow MSI to swap from the old cooling system to the new one, but this shut-down applied to the entire project, meaning every subcontractor would be onsite getting the data center up and running. MSI would need to work in conjunction with the other subcontractors to bring the project to the point where the system could be started simultaneously.

The heart of that challenge was building the new cooling system without the ability to tap into the existing operational system. MSI and its team developed a customized project timeline that clearly planned each part of the project leading up to the shut-down. The timeline allowed the team to stay on track and complete each stage of the project before the limited three-day window.

Cisco provided ample space to prefabricate all the piping on-site. MSI carefully marked and labeled where the new piping would be installed, but rebuilding a pump room in so short a time was still going to be a challenge. For six weeks leading up to the planned shut-down, the MSI team worked to prepare for the tie-in.

Data center equipment generates a considerable amount of heat in a relatively small area, because every watt of power used by a system is dissipated into the air as heat. Unless that heat is removed, the ambient temperature will rise, eventually exceeding design specifications and causing electronic equipment failure. That’s why it’s so important for Cisco to have a properly managed air conditioning system running at all times in their data centers.
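The principle above can be sketched with the standard sensible-heat relation Q = m·cp·ΔT, which tells you how much airflow is needed to carry a given heat load away. The 10 kW rack load and 12 K temperature rise below are assumed example values, not project figures:

```python
# Sensible-heat sketch: Q = m_dot * cp * dT  (illustrative, not project data)
# q_kw  = heat load in kW (every watt of IT power ends up as heat)
# cp    = specific heat of air, ~1.005 kJ/(kg*K)
# dt_k  = allowed air temperature rise across the equipment, in kelvin

def airflow_for_load(q_kw: float, dt_k: float, cp: float = 1.005) -> float:
    """Return the air mass flow (kg/s) needed to remove q_kw at a rise of dt_k."""
    return q_kw / (cp * dt_k)

# Hypothetical example: a 10 kW rack with a 12 K air temperature rise
m_dot = airflow_for_load(10.0, 12.0)
print(f"Required airflow: {m_dot:.2f} kg/s")
```

The same relation explains why a cooling outage escalates quickly: with m_dot at zero, the full load goes into raising the room temperature instead.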

For the only planned shut-down during this project, Cisco arranged to reroute its network traffic to another facility, giving MSI, Votze Butler, and the other subcontractors time to swap over to the new cooling system. This meant everyone had to be in sync and ready to connect during the shut-down.

Once the existing cooling system was turned off, MSI’s priority was to conduct multiple wet taps in the original piping for draining purposes. The team had to drain the existing glycol from the pipes, tie into the new 12” header while keeping the existing 8” header active, and install a series of valves so that the 8” header could be isolated and drained at the end of the project. The valves also allowed one pump at a time to be isolated, drained, and changed out for a new, larger pump.

The work was completed successfully and without any issues, because the team had prepared a working solution during the six weeks leading up to the shut-down. When it was time to make the official switch to the new system, everything performed as planned. MSI worked successfully with the other subcontractors to ensure a smooth transition. On Monday morning, when the cooling system was fully operational during normal business hours, MSI was on hand to make sure no issues arose – and none did. We are committed to providing service after installation and continue to maintain a successful working relationship with Cisco.

