SAN JOSE – Cisco is not perceived to be a Big Data or analytics company. But at its Global Editors Conference here this week, the company repeatedly stressed that this is not only where it is headed, it is where it is today.
Cisco CEO Chuck Robbins stressed at the event that analytics is fundamental to Cisco’s core business moving forward. So did Biri Singh, Cisco’s SVP and Chief Technology Officer, Platforms and Solutions.
“Moving to real-time analytics and intelligence relates to the speed at which Level 2 and 3 move up the stack and its ability to assess and drive higher-level outcomes,” Singh said. “We want to take it to autonomic infrastructure, and take the next set of microservices and container-based stacks and deploy them, and do services delivery around them. Streaming analytics for real-time subsecond decision analysis comes into play with hundreds of billions of events in a day that can be analyzed in real time. We can arrive at a level of understanding that’s not commonplace in the industry.”
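The streaming idea Singh describes can be illustrated with a toy sketch: events are aggregated in a sliding time window as they arrive, so a decision can be made per event without storing the full stream. The class, window size and event shape below are illustrative assumptions, not Cisco's implementation.

```python
# Toy sliding-window aggregation over a stream of timestamped events.
from collections import deque

class SlidingWindowCounter:
    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events = deque()      # timestamps still inside the window

    def observe(self, ts: float) -> int:
        self.events.append(ts)
        # Expire anything older than the window before answering.
        while self.events and ts - self.events[0] > self.window:
            self.events.popleft()
        return len(self.events)    # events seen in the last `window` seconds

c = SlidingWindowCounter(1.0)
print([c.observe(t) for t in (0.0, 0.5, 0.9, 2.0)])  # → [1, 2, 3, 1]
```

Real systems shard this kind of state across many nodes, but the per-event, constant-memory pattern is what makes subsecond decisions possible at that volume.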
Rebecca Jacoby, Cisco’s SVP, Operations, explained how Cisco had used its analytics capability internally to bring about significant savings.
“We put in place an approach around adaptive testing, where the use of analytics in that process allows us to adapt the testing, improve quality and reduce the time to implement a change from sometimes weeks to seconds,” she said. “We put the same policy-based and flexible assets in place to manage energy in our supply chain as a pilot, and are seeing 20-30 per cent savings. It makes very physical assets in the supply chain very flexible.”
Jacoby said that this architecture, with its flexible interface for developers and its use of rules and policy engines rather than coding as before, allows very flexible provisioning and deprovisioning of assets.
“We have applied this process to the fundamental service delivery process in IT, which allows for automation of the service delivery process, based on a construct of analytics coming together with the rules based engine to manage risk for the company,” she said. “We have achieved really tremendous results across the board, in corporate finance, Cisco commerce, the customer and partner experience and Cisco Services.”
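The rules-and-policy approach Jacoby describes can be sketched minimally: provisioning decisions are expressed as declarative rules evaluated against analytics signals, rather than hard-coded logic. The rule conditions, thresholds and metric names below are hypothetical illustrations, not Cisco's engine.

```python
# Declarative rules evaluated against incoming metrics, first match wins.
rules = [
    # (condition on metrics, action to take)
    (lambda m: m["cpu_util"] > 0.80, "provision"),
    (lambda m: m["cpu_util"] < 0.20, "deprovision"),
]

def decide(metrics: dict) -> str:
    """Return the action of the first rule whose condition fires."""
    for condition, action in rules:
        if condition(metrics):
            return action
    return "no_change"

print(decide({"cpu_util": 0.9}))   # high load → "provision"
print(decide({"cpu_util": 0.5}))   # within band → "no_change"
```

The point of the pattern is that changing behaviour means editing the rule list, not rewriting and redeploying code.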
“Cisco isn’t thought of as a Big Data or analytics company,” acknowledged Mike Flannagan, Vice President and General Manager for Cisco’s Data and Analytics Group. “But to do this now, you need IT infrastructure that sits everywhere, and with the exception of desk phones, the network infrastructure is the most ubiquitous piece.”
As a result, while until recently high-value data was typically stored in a data warehouse from Teradata, Oracle or IBM, customers are now adopting a much more distributed approach, believing most data will be processed at the edge, in mobile devices, appliances and routers.
To take advantage of this, Cisco launched its analytics portfolio last December, and has added to it since.
“While in the past, data access here typically involved interaction with one data source, there has been an explosion of other types of sources, like Hadoop, Cassandra, and MongoDB,” Flannagan said. “We have no fewer than 10 data source types managing all our corporate data at Cisco.” Technology from Cisco’s 2013 acquisition of Composite Software, whose data virtualization software makes it easy to access and integrate data regardless of its location, was important here.
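The core idea of data virtualization can be shown in a few lines: a single virtual view federates records from two differently shaped sources at query time, without copying either into a central warehouse. The data shapes and field names below are hypothetical illustrations, not Composite Software's API.

```python
# Two sources with different shapes, left where they live.
relational_rows = [   # e.g. rows from a warehouse table
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]
document_store = [    # e.g. documents from a NoSQL store
    {"customer_id": 1, "open_tickets": 3},
    {"customer_id": 2, "open_tickets": 0},
]

def virtual_view():
    """Join both sources on customer_id at query time, copying nothing."""
    tickets = {d["customer_id"]: d["open_tickets"] for d in document_store}
    for row in relational_rows:
        yield {**row, "open_tickets": tickets.get(row["customer_id"], 0)}

print(list(virtual_view()))
```

A consumer of `virtual_view` sees one consistent schema and never needs to know there were two silos behind it.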
“We use data virtualization to solve a regulatory problem in financial services, where they will have to present an accurate risk assessment which includes a consolidated view of all risk from all sources,” Flannagan said. “That data exists in all kinds of places and is very siloed. This regulatory requirement needs to be done by January 2016, or the institutions will not be fiscally sound.”
Flannagan noted that their analytics data also helped call centres move away from first call resolution, which had previously been their top priority.
“They learned from the data that customers don’t consider first call resolution to be the main thing,” he said. “Customers want their call to be low effort. They don’t want to be on hold for 45 minutes until you have an answer for them. Let them go and call them back when you have a solution, because they value their time more.”
Flannagan noted real estate companies as another good example of a market for this technology.
“Building centers had no data about who came into a design center unless a deal closed, in which case they could ask the purchaser what had been useful in the buying decision,” he said. “But this didn’t capture anything from people who didn’t buy, and they wanted that data to see what influenced purchasing decisions.”
“The data didn’t have to come from a big new investment, because it all exists today inside devices customers have already paid for, so we just helped them extract it and use it,” Flannagan said. “They had Cisco wi-fi access points and video cameras for physical security. We helped them use those to pull data and create a dashboard view of their design centers that shows them their traffic. The wi-fi tells how many people came in, and on what days, but the wi-fi isn’t accurate enough to tell what counters they stood at, whether they looked at, say, carpet or hardwood. The video data is, though, and that helps them understand engagement. Using video analytics we can tell if the customer is a man or woman and their approximate age, with about 95 per cent accuracy, and it’s all automated.”
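A minimal sketch of that dashboard, under stated assumptions: Wi-Fi probe events give per-day visitor counts, while (simulated) video-analytics records add what the Wi-Fi cannot see, namely which counter a visitor stood at. The event shapes and values below are illustrative, not data from any real deployment.

```python
from collections import defaultdict

wifi_events = [   # (day, device MAC) sightings from access points
    ("Mon", "aa:01"), ("Mon", "aa:02"), ("Mon", "aa:01"), ("Tue", "aa:03"),
]
video_events = [  # (day, counter visited) inferred from cameras
    ("Mon", "carpet"), ("Mon", "hardwood"), ("Tue", "hardwood"),
]

visitors = defaultdict(set)
for day, mac in wifi_events:
    visitors[day].add(mac)          # de-duplicate devices per day

counters = defaultdict(lambda: defaultdict(int))
for day, counter in video_events:
    counters[day][counter] += 1     # engagement per counter per day

dashboard = {day: {"visitors": len(macs), "counters": dict(counters[day])}
             for day, macs in visitors.items()}
print(dashboard)
```

Each source answers the question it is good at (footfall from Wi-Fi, dwell location from video), and the dashboard simply merges the two.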
Doing this kind of analysis at the edge rather than in a datacentre also helps Cisco deal with privacy concerns, Flannagan said.
“Cisco has a positive differentiation here. Because we do analytics at the edge, we analyze the stream in the retail store, and it’s discarded after it is analyzed. Once we score the data, we discard it, so no picture of you is transported off-site.”
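The score-then-discard pattern Flannagan describes can be sketched simply: each raw "frame" is reduced to an anonymous score at the edge, and only the scores, never the raw data, are retained or shipped off-site. The scoring function below is a stand-in assumption, not Cisco's model.

```python
def score(frame: bytes) -> dict:
    # Stand-in for a real classifier; returns only aggregate attributes,
    # never the frame itself.
    return {"size": len(frame)}

def analyze_stream(frames):
    """Score each frame at the edge and keep only the scores."""
    results = []
    for frame in frames:
        results.append(score(frame))
        del frame                 # raw frame is discarded after scoring
    return results                # only scores leave the device

print(analyze_stream([b"\x00" * 10, b"\x01" * 20]))  # → [{'size': 10}, {'size': 20}]
```

Privacy here is structural rather than procedural: because the raw imagery never persists past the scoring step, there is nothing identifiable to transport off-site.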
Finally, Flannagan stated that network analytics is a great incremental opportunity for Cisco partners.
“Network analytics is a great channel opportunity to go back to customers and say they can provide intelligence on top of the network, like showing how particular features they choose can create some unnecessary complexity,” he said.