And even if an organization performs manual application dependency mapping (ADM), it is not a one-time 
operation. Mappings must be maintained as living documents. Otherwise, the source of information will 
increasingly diverge from the source of truth: the actual application traffic. This divergence creates new challenges 
for the data center team as it attempts to manage network changes based on misinformation. 
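
To make that divergence concrete, the following minimal Python sketch compares a documented dependency map against dependencies observed in recent traffic. The service names and record format here are hypothetical placeholders; in practice, the observed set would come from real flow telemetry such as NetFlow or IPFIX exports.

    # A minimal sketch of dependency-map drift detection (Python).
    # Service names and edges are hypothetical examples.

    # Documented dependencies: (client, server) pairs from the living document.
    documented = {
        ("web-tier", "app-tier"),
        ("app-tier", "db-tier"),
    }

    # Dependencies actually observed in recent flow telemetry.
    observed = {
        ("web-tier", "app-tier"),
        ("app-tier", "db-tier"),
        ("app-tier", "cache-tier"),  # present in traffic, absent from the document
    }

    # Divergence in both directions: edges the document misses,
    # and documented edges that no longer carry traffic.
    for client, server in sorted(observed - documented):
        print(f"UNDOCUMENTED: {client} -> {server}")
    for client, server in sorted(documented - observed):
        print(f"STALE: {client} -> {server}")

Run periodically, a check of this kind keeps the document honest: undocumented edges flag mappings that need to be added, and stale edges flag entries that should be retired.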
Difficulties of Application Dependency Mapping 
Mapping applications has traditionally been difficult, from both business and technical perspectives. 
Mapping anything in this world is generally a laborious process. For example, the ocean covers 
roughly 70 percent of the earth’s surface, yet we have mapped only about 5 percent of it in detail (Schmidt Ocean 
Institute, 2013). 
The application landscape of a data center can be thought of as an ocean. From the surface, you can see what 
appears to be running, but below the surface is an entire world of interrelated flows that connect 
the surface components together. These deep flows are what make ADM difficult: they are hard 
to discover, analyze, and map. 
Three approaches are commonly used today: 
● Manual approach: The network team attempts to manually collect application information from the groups that have deployed applications on top of the network infrastructure. 
● Simple automated approach: Simple automated tools are run in an attempt to collect ADM information. 
● Outsourced approach: An external agency combines the manual and automated approaches, evaluates the collected information, and compiles it into a report. 
Most organizations find the manual approach labor- and cost-intensive, and from a technical perspective it is too 
rigid. Manual ADM is feasible only at a very small scale. The network team must also maintain a strong relationship 
with the application teams, who gather the required data. The process is prone to human error, 
and the probability of accurately mapping the applications in the data center is low. The collected data is usually 
compiled into a static report that diverges from reality as soon as it is published. 
Businesses may be tempted by the simple automated option. It appears to incur only the low capital costs needed 
to purchase software that claims to be able to map applications, and to require only a small operations team to run 
the ADM tool. However, current simple software implementations of ADM are notoriously ineffective at truly 
mapping the application landscape. 
Often, the ADM software is not to blame. The problem instead is the source data it ingests. With poor or 
incomplete input data, the ADM tool cannot make accurate observations, however revolutionary the mapping 
algorithm may be. To complete the mapping process, the low-quality automated recommendations must therefore 
be supplemented by human input, increasing the long-term costs of this approach. Traditional ADM software 
usually does not learn from this human supervision, forcing staff to repeatedly correct the tool. 
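
The following Python sketch makes this garbage-in, garbage-out limit concrete. The flow records and service names are hypothetical, and the mapping logic is deliberately trivial, to show that the loss occurs in the data rather than in the algorithm.

    # A minimal sketch: ADM output quality is bounded by input quality.
    # Flow records and names are hypothetical illustrations.

    def map_dependencies(flows):
        # Derive a dependency map: one edge per observed client/server pair.
        return {(f["src"], f["dst"]) for f in flows}

    # Complete telemetry records every conversation.
    complete = [
        {"src": "web", "dst": "app"},
        {"src": "app", "dst": "db"},
        {"src": "app", "dst": "cache"},
    ]

    # Sampled telemetry (for example, 1-in-N packet sampling) misses
    # short-lived or low-volume flows, so edges silently disappear.
    sampled = complete[:2]

    print(map_dependencies(complete))  # three edges: the full picture
    print(map_dependencies(sampled))   # two edges: the cache dependency is lost

    # No refinement of map_dependencies() can recover the app -> cache
    # edge from data that never recorded it; only better capture can.

The same reasoning applies at data center scale: pervasive, high-fidelity flow capture matters more than the sophistication of the mapping algorithm that runs on top of it.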
Traditionally, capturing enough high-quality source input data in data centers at scale has been possible for only a 
few organizations with huge resource pools and equipment. 
The outsourced approach inherits the problems of both other approaches and adds a layer of cost and complexity. It also 
does not usually provide any more actionable or insightful mapping than the other two techniques.