No one likes to be a guinea pig -- least of all the CIOs whose heads are likely to roll if they overspend even a few cents beyond their shoestring budgets. That is perhaps why everybody loves to talk about Cloud Computing but rarely takes the risk of implementing anything beyond simple virtualization or a subscription to Salesforce.com. Acting as a "guinea pig," however, is sometimes beneficial: pioneers create true value by proving a concept.
As far as adopting Cloud Computing goes, it is possible to be a guinea pig, create value for yourself (and not someone else) and still not get lost in the process. How?
The most important move is to look for the "easy wins": target important services that are prone to unpredictable, often short-lived demand. These are the best candidates for proving that Cloud Computing works. Here are a few examples.
1. Performance and Load Testing
Performance and load testing of new applications or major releases of refurbished applications is usually a resource-intensive process within any organization, consuming time, staff, hardware and software. Provisioning these resources, especially the computing infrastructure, on demand via a cloud architecture can save precious resources, and it also serves as a useful test of the cloud architecture itself.
Additionally, because performance and load testing is not, relatively speaking, a mission-critical process, an operational failure can be tolerated. Most firms choose to test out private cloud architecture through precisely such a project.
The results have been broadly positive, especially on the cost front. Given that virtualization enables testing across diverse applications and platforms, wide-ranging simulations can be carried out for a fraction of the earlier cost. Performance and scalability testing done over the cloud also enables more robust simulations, since scenarios can be constructed with complete flexibility in operating systems, virtualization platforms and related applications.
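The core pattern is simple: fan out many concurrent requests from on-demand workers and measure latency. A minimal sketch of that idea follows; the `send_request` stub is a hypothetical stand-in for a real HTTP call against the system under test, and the numbers are illustrative, not a real benchmark.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

def send_request(i):
    """Hypothetical stand-in for a real call to the system under test."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate server processing time
    return time.perf_counter() - start

def run_load_test(n_requests=100, concurrency=20):
    """Fan n_requests out across `concurrency` workers, as a pool of
    cloud-provisioned load generators would, and summarize latency."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(send_request, range(n_requests)))
    return {"requests": n_requests,
            "mean_latency_s": mean(latencies),
            "max_latency_s": max(latencies)}

result = run_load_test()
```

In a cloud deployment, the thread pool would be replaced by short-lived virtual machines spun up only for the duration of the test, which is where the cost savings come from.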
2. Employee Training
Most organizations of any size run integrated training programs that must be delivered to employees on a regular basis. Such programs, although effective and desirable, place a heavy strain on existing resources, especially limited software licenses. An internal cloud, properly scaled and loaded with the requisite software, can easily meet these training requirements.
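The licensing constraint can be pictured as a fixed-size pool of cloud-hosted training environments: a trainee checks one out, works, and releases it for the next person. The sketch below is a hypothetical model of that pattern, not any vendor's API.

```python
import threading
from contextlib import contextmanager

class TrainingEnvironmentPool:
    """Hypothetical pool of cloud-hosted training environments,
    capped by the number of available software licenses."""
    def __init__(self, licenses):
        self._slots = threading.Semaphore(licenses)
        self._lock = threading.Lock()
        self.active = 0  # environments currently checked out

    @contextmanager
    def checkout(self):
        self._slots.acquire()  # blocks when all licenses are in use
        with self._lock:
            self.active += 1
        try:
            yield
        finally:
            with self._lock:
                self.active -= 1
            self._slots.release()

pool = TrainingEnvironmentPool(licenses=5)
with pool.checkout():
    in_use = pool.active  # one environment checked out inside the block
```

The semaphore enforces the license cap: a sixth trainee simply waits until an environment frees up, rather than forcing the organization to buy more licenses for peak demand.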
3. Business Continuity Planning
Most organizations with a complicated service chain, especially in today's globally dispersed world, place a high value on business continuity planning (BCP). Downtime for projects, processes, software applications and platforms should be kept to a minimum, and a considerable amount of money is spent raising redundancy levels to as much as three times normal requirements.
A reliable cloud architecture offers considerable potential here. If the primary deployment -- which would not be part of the cloud -- fails, the cloud component of the BCP module can serve as the immediate redundancy, seamlessly integrating with the existing set-up, on tap, to provide a business-as-usual scenario. Cost savings are the most obvious advantage, along with back-ups that are independent of geography.
Cloud architecture has become the framework of choice for companies with a large global footprint. Multiple copies of data, operating systems and applications can reside in varied locations around the globe, reducing the chance that a failure in one region cripples operations worldwide.
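Stripped to its essentials, the failover logic described above is "try the primary; on failure, serve from the cloud replica." A minimal sketch, with hypothetical function names standing in for the real deployments:

```python
def with_failover(primary, cloud_replica):
    """Return the result of primary(); if it raises, fall back to the
    cloud replica so service continues during the outage."""
    try:
        return primary()
    except Exception:
        return cloud_replica()

def primary_site():
    # Hypothetical primary deployment, down for this illustration.
    raise ConnectionError("primary deployment down")

def cloud_site():
    # Hypothetical cloud-hosted redundancy from the BCP module.
    return "served from cloud replica"

response = with_failover(primary_site, cloud_site)
```

Real BCP set-ups add health checks, replication lag monitoring and failback, but the economic point is the same: the replica consumes cloud resources only when invoked, instead of idling as dedicated redundant hardware.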
4. Analytics
Analytics, for any organization, is a data-driven process. It requires storing, categorizing, archiving and retrieving data on an ongoing basis, often by multiple teams working in different countries. Such data can run into terabytes, and storing it becomes an increasingly expensive proposition. Cloud computing can help with its storage and retrieval, thanks to the ever-falling cost of cloud storage.
Server downtime, data refreshes and data archival -- each a recurring headache -- can be sidestepped by keeping multiple copies of the data, allowing work to continue while the data are refreshed or archived. Such clouds could be private, residing within the secure walls of an organization, or made public when necessary. They also provide a holistic, integrated view of the analytical work being done across the entire organization while using existing Business Intelligence resources more effectively -- the twin advantages of greater speed to market and real cost reductions.
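The multiple-copies idea can be sketched as a toy replicated store: writes go to every region, and reads fall back past any region that is down. The region names and in-memory dictionaries below are illustrative assumptions, not a real storage service.

```python
class ReplicatedStore:
    """Toy model of keeping copies of analytics data in several
    regions: writes go everywhere, reads skip unavailable regions."""
    def __init__(self, regions):
        self._copies = {region: {} for region in regions}
        self._down = set()

    def put(self, key, value):
        # Replicate every write to all regions.
        for store in self._copies.values():
            store[key] = value

    def fail_region(self, region):
        # Simulate an outage in one region.
        self._down.add(region)

    def get(self, key):
        # Read from the first healthy region holding the key.
        for region, store in self._copies.items():
            if region not in self._down and key in store:
                return store[key]
        raise KeyError(key)

store = ReplicatedStore(["us-east", "eu-west", "ap-south"])
store.put("q3_sales", [1, 2, 3])
store.fail_region("us-east")
value = store.get("q3_sales")  # still served, from a surviving region
```

This is exactly the property the BCP section relies on: an outage in one geography degrades nothing, because another copy of the data answers the read.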
Taking it slow
Cloud computing remains an exciting technology with diverse applications. It is still relatively new and fraught with risks, but it is gaining acceptance as a cost-effective delivery mechanism -- one that businesses should consider when implementing new offerings within their companies.
A solution that more and more firms are embracing is to test cloud architecture on a limited, trial-only basis. Value-added services such as training, analytics, virtualization and business continuity planning offer scope for such trials, with the potential for quick wins and significant cost savings.
This trend is likely to continue in the near future, with the potential for positive results for all parties concerned.