Last month I had the pleasure of seeing Don Duet, Co-Head of the Technology Division at Goldman Sachs, give a keynote presentation at the Gartner Data Center conference on the company’s data center strategy. As the global investment firm looks to deliver an array of new digitized services and products to its multinational workforce, the data center is being asked to increase agility while mitigating risk in the fluid, and sometimes volatile, world of investment banking.
For Duet and Goldman Sachs, agility is a catalyst for mitigating business risk. The IT organization brings new services and products to market faster through a combination of localized data centers that support the company’s global workforce, virtualized and cloud-computing environments that improve resource availability, and the adoption of new data center technologies despite the complexity they add. As we’ve seen from our own financial services customers, an enterprise-grade IT automation strategy parallels these computing trends in the push to mitigate risk by building a data center that’s scalable and designed to accommodate change as IT looks to lead the charge into the digitized age.
Recent McKinsey & Company research further underscores this in its Enterprise IT Infrastructure Agenda for 2014 paper. As more business value migrates online and business processes become more digitized, IT infrastructure inevitably becomes a bigger source of business risk. “Even after years of consolidation and standardization, which have led to huge improvements in efficiency and reliability, most infrastructure leaders work in environments that they believe are too inflexible, provide too few capabilities to business partners and require too much manual effort to support. Addressing these problems to create more scalable and flexible next-generation infrastructure will require sustained actions in multiple dimensions,” says Bjorn Munstermann in the paper.
These “digitized” processes McKinsey & Company refers to reside, and are often automated, within the data center. As a result, they place the business at increased risk when the IT organization relies on an outdated automation strategy that is fragile, unscalable and not designed to accommodate change. This is a common pain point for many of our own financial services customers who leverage ActiveBatch to automate the data center workloads that support business processes for investment bankers and their systems. One example is automating data center ETL-type processes that bring in market and financial data, format it, and update trading desk systems and applications in near real time. Workflows such as these aren’t static, but rather dynamic processes that are constantly changing based on market demands and competitive pressure. As new data sources are identified, or as traders and analysts require access to new data sets, the data center’s ability to update the underlying data integration processes is critical.
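To make the extract-transform-load pattern described above concrete, here is a minimal Python sketch. Everything in it is a hypothetical illustration, not ActiveBatch code: the feed, the field names, and the trading-desk store are all invented for the example, and a real workflow would pull from live market-data sources rather than in-memory sample records.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """Normalized market quote as a trading desk system might expect it."""
    symbol: str
    price: float
    currency: str

def extract():
    """Extract step: stand-in for a real feed pull (e.g. an API call or file drop).
    Returns raw records in the (hypothetical) upstream format."""
    return [
        {"sym": "ACME", "last": "101.25", "ccy": "usd"},
        {"sym": "GLOBEX", "last": "87.10", "ccy": "eur"},
    ]

def transform(raw_records):
    """Transform step: normalize field names, types, and casing."""
    return [
        Quote(symbol=r["sym"], price=float(r["last"]), currency=r["ccy"].upper())
        for r in raw_records
    ]

def load(quotes, store):
    """Load step: update the downstream store, keyed by symbol."""
    for q in quotes:
        store[q.symbol] = q
    return store

# Run the pipeline end to end against an empty store.
desk_store = {}
load(transform(extract()), desk_store)
```

The point of the sketch is the shape, not the details: when a new data source appears, only `extract` and `transform` change, while the scheduling and downstream load remain stable, which is exactly the kind of change such workflows must absorb.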
A fragmented collection of scripts and platform-specific scheduling tools amounts to an uncoordinated set of automation solutions, implemented independently of each other and requiring constant revision and re-synchronization in the face of change. Given the diversity of today’s data centers, such a strategy represents risk to the business by building a barrier to data center agility. So many financial services companies have adopted an enterprise-grade automation strategy because it allows the data center to accommodate change or, as Duet said, it allows the data center “to use agility as a risk mitigator by allowing it to embrace complexity and change; two terms that have never been particularly popular with the CIO.”