Last summer, the BB&T Center and Florida Panthers hosted the 2015 NHL Draft. It was a great success, and I am proud to work with such a great operations team. On the surface it looked like any other event, but there were a number of unique challenges our IT group had to meet.
To make the challenge even more exciting, the arena was rented for a private event the weekend before, so load-in could not begin until the Monday of that week. All tables, switches, Wi-Fi access points, and cabling had to be prepared in advance but staged out of sight, ready to go live by Thursday for an internationally televised event.
To accomplish this, we prepared 30,000 feet of Ethernet cable, staged 13 additional Cisco 2960-X stackable switches, and added 94 access points in a mix of temporary and permanent installations. Sixty temporary Cisco phones were added on the floor (two per team), plus another 30 scattered around for NHL and media/broadcaster use. Our Internet edge router was upgraded from a 7206-G1 to a pair of new ISR 4431s.
The cable was a mixture of temporary backbone port channels run to create redundant routes for the Draft network, team tables, and media risers, plus 600 dedicated hardline drops for visiting media on the risers, both on the floor and in the back-office work areas.
This cabling was interconnected through five stacks of two or three Cisco 2960-X 48-port PoE+ switches in temporary cabinets. The stacks were connected in a ring with dual GigE port channels. Throughput was not expected to be an issue, as the primary use would be Internet access; sustained throughput ran about 280 Mbit/s to the Internet, split over two providers.
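As a rough illustration, one leg of that stack ring could look like the IOS sketch below. The interface numbers, channel-group ID, and choice of LACP are assumptions for illustration, not the event's actual configuration.

```
! Hypothetical sketch: a two-member GigE EtherChannel from one
! 2960-X stack toward the next stack in the ring.
interface Port-channel1
 description Ring uplink to next stack
 switchport mode trunk
!
interface range GigabitEthernet1/0/49 - 50
 description Ring uplink members (bundled via LACP)
 switchport mode trunk
 channel-group 1 mode active
```

With each stack holding two such bundles (one toward each neighbor), spanning tree blocks one path in the ring and the network survives the loss of any single cable or bundle.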
Peak usage was about 2,200 simultaneous users during the event, with no public access. Because we were not used to serving that many clients, we hit an unplanned side effect: we maxed out the port translations (PAT) available on a single public IPv4 address. We remediated this quickly by restricting each client to a maximum of 1,500 translations, which throttled the 10-15 abusive clients that had been consuming them all.
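The arithmetic behind that failure is simple: one public IPv4 address offers on the order of 64K source ports for PAT (the real ceiling varies by protocol and destination tuple), so a handful of misbehaving clients can starve everyone else. A back-of-the-envelope sketch, where every per-client flow count is an assumption rather than a measurement from the event:

```python
# Rough model of PAT port exhaustion on a single public IPv4 address.
# Only USERS (2,200) and the 1,500-translation cap come from the event;
# the per-client flow counts below are illustrative assumptions.

PORT_POOL = 65535        # approximate translation ceiling on one address
USERS = 2200
ABUSIVE = 15             # the 10-15 clients hogging translations
TYPICAL_FLOWS = 10       # assumed concurrent flows for a normal device

# If each abusive client opens ~4,000 flows, they alone nearly drain
# the pool before normal users are even counted:
abusive_load = ABUSIVE * 4000
normal_load = (USERS - ABUSIVE) * TYPICAL_FLOWS
print("uncapped exhausts pool:", abusive_load + normal_load > PORT_POOL)

# With a per-host cap of 1,500 translations, even the worst offenders
# together stay under the ceiling:
capped_load = ABUSIVE * 1500
print("capped fits in pool:", capped_load + normal_load < PORT_POOL)
```

Under these assumed numbers the uncapped load blows past the pool while the capped load fits comfortably, which matches what we saw once the limit was in place.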
The critical parts of the Wi-Fi deployment could not be tested in advance: the floor and media risers were not built until Monday, and we had no way to measure what the RF environment would look like with everything constructed and 1,000 people on the floor, so we asked CDW to help us with the configuration. Omnidirectional access points were placed on every other table, and 3702P access points with directional antennas were used on the sides of the media risers. The mandatory minimum data rate was set to 18 Mbps; doing it over again, I would have set it to 24 Mbps, as some clients were still stretching too far when they moved seats.
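On a Cisco AireOS controller, raising the mandatory minimum rate is a handful of CLI commands. This sketch assumes AireOS syntax and shows the 5 GHz band only; the radio network typically has to be disabled while rates change, and the same would be repeated for 802.11b/g.

```
config 802.11a disable network
config 802.11a rate disabled 6
config 802.11a rate disabled 9
config 802.11a rate disabled 12
config 802.11a rate disabled 18
config 802.11a rate mandatory 24
config 802.11a enable network
```

Disabling the rates below the mandatory minimum forces distant clients to roam to a closer AP instead of clinging to a weak, slow association that drags down airtime for everyone on that radio.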
Each team table had two Cisco IP phones that could ring both lines. In all, six Cat 5e cables were run to each table with a breakout box underneath. These cables were pre-made so that they could simply be rolled out in bundles. This provided two phone drops, two Internet drops, one Draft-network drop, and a spare at each table.
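The per-table plan above is easy to sanity-check with a little arithmetic; the 30-table count here is inferred from the 60 floor phones at two per team, not stated directly in the text.

```python
# Sanity check of the per-table cable plan described above.
TABLES = 60 // 2  # inferred: 60 floor phones at 2 per team table
PER_TABLE = {"phone": 2, "internet": 2, "draft": 1, "spare": 1}

drops_per_table = sum(PER_TABLE.values())     # 6 cables per pre-made bundle
floor_phones = TABLES * PER_TABLE["phone"]    # matches the 60 phones on the floor
total_table_runs = TABLES * drops_per_table   # pre-made runs needed for the tables
print(drops_per_table, floor_phones, total_table_runs)
```

Pre-terminating identical six-cable bundles meant each table was one roll-out and one breakout box, rather than six individual pulls during a four-day load-in.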
Internet was provided primarily by Comcast fiber (300/300) for guests and AT&T (100/100) for in-house operations, routed via the Cisco ISR 4431s at the edge and distributed by a Cisco 4507 to a total of 27 switch stacks throughout the arena.
It was a tremendous experience to prepare and host this two-day live television event, even if it meant a year of planning and weeks of 12-16 hour days. My team had to support normal operations while we built and prepared all of this new and temporary infrastructure, but having support from CDW and Cisco made the job easier.