OPINION: Why Are We Arguing About Clouds?
How much are organizations willing to pay for their peace of mind when it comes to their IT infrastructures?
I only ask because the Depository Trust & Clearing Corp. has published a white paper extolling the benefits of hosted services, such as the public cloud.
It’s a strange conversation to have in 2017 until you realize that, according to the white paper’s authors, the DTCC plans to move a greater portion of its services and applications to the public cloud within the next three to five years.
Since the DTCC is member-owned, I’m sure there are a few stakeholders who would rather wait until hell freezes over before they put any of their data or processes into the public cloud.
They can argue security and regulatory concerns, but running dedicated systems is an expensive proposition. The costs of maintaining the hardware, software, storage, and network infrastructure, as well as the staff to support it, will only increase as competition and economies of scale drive the unit price of their hosted counterparts toward zero.
However, this isn’t an argument for virtualizing all IT infrastructure. The cloud may be attractive to firms that are building their IT infrastructure from scratch or have a relatively small IT footprint, but the move would not be easy for organizations with a multitude of legacy systems. They would need to do the same cost-benefit analysis they would perform before moving those systems to any new platform, hosted or not.
The greatest argument for maintaining dedicated infrastructure is the perception that dedicated infrastructures are more secure than shared ones, but that perception ignores the rigorous due-diligence requirements to which cloud providers are already subject.
Having one’s processes in the cloud doesn’t automatically protect an organization from cyber threats.
However, of the estimated 45,000 computers infected with the WannaCrypt malware, none were reported to belong to any of the major public cloud providers.
WannaCrypt and its variants exploited a weakness in Microsoft’s Server Message Block (SMB) protocol, for which the vendor released a patch in March 2017. A firm running an unpatched version of SMB in a cloud environment would be just as vulnerable as one running it on a dedicated server.
But which is more likely to happen: spinning up a single unpatched virtual machine, or failing to update every server in a data center to the latest operating system version, patches included?
Once again, it all boils down to whether firms are willing to bear the increasing costs associated with dedicated systems.