Cisco Aironet 2600e Access Point White Paper
Test Methodology
A team of three graduate students, supervised by the faculty member responsible for the CCENT
wireless testing projects, worked with technical representatives from Cisco’s Wireless Networking
Group (WNG) to configure Cisco’s new failover services on the wireless test-bed in our lab. The
testbed consisted of Cisco’s failover-enabled wireless network infrastructure and a range of typical
enterprise application services that we used to evaluate failover behavior. We focused our efforts on
three commonly deployed enterprise applications: Citrix XenDesktop 5.6 VDI, Microsoft Lync 2010,
and Windows file sharing using the Common Internet File System (CIFS). We also used Adobe Connect to
assess an educational application and a Cisco 7921G phone to assess the experience for a VoIP
phone. To measure application downtime, we used an online timer, which allowed us to
measure application response time in milliseconds. The timer was started once we detected that an
application had stopped working and was stopped once the application started working again. In
addition, we used a ping utility to record the exact times when client sessions were dropped and
subsequently restored. To obtain granular measurements for Client SSO, the interval between ping
packets was set to 100 milliseconds, so we could measure system recovery time by counting
the number of dropped pings and multiplying by 100 milliseconds.
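The recovery-time estimate described above can be sketched as follows. This is a minimal illustration of the arithmetic, not part of any Cisco or lab tooling; the function name and values are hypothetical:

```python
# Hypothetical helper illustrating the measurement method above:
# with pings sent at a fixed interval, the failover gap is approximately
# (number of consecutive dropped pings) x (ping interval).

def recovery_time_ms(dropped_pings: int, interval_ms: int = 100) -> int:
    """Estimate system recovery time from consecutive dropped pings."""
    return dropped_pings * interval_ms

# Example: 12 consecutive lost pings at a 100 ms interval
# suggests roughly 1.2 seconds of downtime.
print(recovery_time_ms(12))  # 1200
```

On Linux clients, a 100 ms ping interval can be requested with `ping -i 0.1 <address>`; the effective resolution of this method is bounded by that interval.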
To ensure accurate measurements, the testbed was isolated from the University’s wired and
wireless systems. A Cisco 2911 router configured for NAT services provided access to
testbed devices, when necessary, from the University network. Our network testbed included a Cisco
Catalyst 3750 switch and a Cisco UCS C210 server running VMware ESXi 5.0 that hosted all of our virtual
machines, including an Active Directory domain controller, a Lync 2010 server, and the Citrix XenDesktop
controller and VMs (see Figure 1 below). For tests involving Adobe Connect, we accessed those
services from Syracuse University’s network, through which we established and tested video
collaboration sessions between two wireless clients on our testbed. All testing was performed with
both a MacBook Pro and Dell Latitude E6430 laptops equipped with 3-stream-capable 802.11n network
adapters. In our preliminary testing, we were not able to detect any differences between these two
client types; all reported test results are based on the Dell laptops. The network was configured
for WPA2-Enterprise, with PEAP authentication handled by the backend RADIUS server.
All performance testing took place on the second floor of Hinds Hall, home to SU’s School of
Information Studies, in a typical enterprise cube office (inside the CCENT Lab).
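For readers unfamiliar with this security setup, the following is a minimal sketch of how a WLAN with WPA2-Enterprise (802.1X) and an external RADIUS server might be defined on a Cisco AireOS wireless LAN controller. The SSID, WLAN ID, server address, and shared secret are placeholders, and this is not the exact configuration used in our testbed; note that PEAP itself is configured on the RADIUS server, not on the controller:

```
! Define the backend RADIUS server (placeholder address and secret)
config radius auth add 1 192.0.2.10 1812 ascii <shared-secret>

! Create the test WLAN (placeholder WLAN ID and SSID)
config wlan create 1 CCENT-Test CCENT-Test

! WPA2-Enterprise: WPA2 security with 802.1X key management
config wlan security wpa enable 1
config wlan security wpa akm 802.1x enable 1

! Bind the WLAN to the RADIUS server defined above, then enable it
config wlan radius_server auth add 1 1
config wlan enable 1
```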