
Showing posts with label Trends. Show all posts

Friday, February 22, 2008

Cloud Computing: Is It Hype or the Future of Data Centers?

I was reading the press release about IBM's Blue Cloud initiative, though the press release was old. Afterwards I started doing some research on cloud computing, came across some wonderful organizations providing this service, and learned about the technologies behind it. Though it's a bit early, it reinforces my belief that we are moving towards utility-based computing. Service providers who haven't planned for this, or who lack a vision for it, are surely going to suffer in the next couple of years.
What is cloud computing?
I have copied and pasted this definition from Wikipedia:
“Cloud computing is a new (circa late 2007) label for the subset of grid computing that includes utility computing and other approaches to the use of shared computing resources, rather than having local servers or personal devices handling users' applications.”
At the heart of cloud computing is virtualisation. A service provider delivers a virtualised computing resource to a consumer over the network/Internet. Hosting providers that offer virtualised server images can also be termed cloud service providers. However, as they move up the value chain, we find providers with grids of thousands of machines offering these infrastructure utility services through a web interface. It is irrelevant which physical server your application is using; the provider only guarantees resources like CPU, RAM and storage. The next time you reboot your application, it can get transferred from London to Hong Kong.

Most of these providers use MapReduce and Hadoop for processing large data sets. They break the work into many small chunks, so that it can be distributed in parallel across thousands of computers. It all started at Google as MapReduce, which was eventually taken into the Apache projects to produce Hadoop, now an open source Java-based framework. IBM is using Hadoop in its Blue Cloud initiative.
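To make the "break the work into small chunks" idea concrete, here is a minimal single-machine sketch of the MapReduce word-count pattern in Python. Hadoop itself is Java and runs these phases across thousands of nodes; the function names here (`map_phase`, `shuffle`, `reduce_phase`) are my own illustration, not Hadoop's API.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in one chunk of input.
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(mapped_pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: combine all the counts for one word.
    return (key, sum(values))

documents = ["the cloud is the future", "the grid powers the cloud"]
mapped = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
```

In a real cluster each `map_phase` call runs on a different node near its chunk of the data, which is what lets the pattern scale to very large data sets.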
Amazon Web Services' EC2 (Elastic Compute Cloud) service is built around Xen virtualised images. These services can rid us of hardware altogether. Setup is straightforward for simple websites, but a more complicated infrastructure still needs all the configuration work done, with the assistance of a system administrator. EC2 uses Amazon S3 for storage. Recently, S3 was in the news for all the wrong reasons, as it was down for a couple of hours; serious corporate customers will hesitate to use this kind of service. Yahoo and Google are also planning to provide these services.

While doing some Internet research I came across a company called 3Tera. It takes a holistic approach and provides a wonderful service: it allows customers to create their own virtual private data center. 3Tera has partnered with a number of hardware service providers which supply commodity hardware (50,000+ servers). Using its AppLogic grid operating system, it creates a layer of abstraction over those servers and creates virtual images. It has built its storage network on the commodity direct-attached disks of these servers. According to 3Tera, AppLogic is the first grid operating software for web-based applications. It provides an AJAX-based interface to configure the infrastructure virtually. In the demo, 3Tera showed how to create a virtual infrastructure comprising a firewall, load balancer, web servers, SQL server and NAS simply by drag and drop, as if one were drawing a Visio diagram. In reality they are all virtual images in 3Tera's cloud; application instances get deployed on these virtual servers, and the entire infrastructure is up in just ten minutes. I believe this is the future of infrastructure providers. I strongly recommend you see the demo yourself.

Wednesday, February 20, 2008

Convergence of SOA and GRID

If we want to predict the future, we need to look back at history. Industrial history shows that grid computing is inevitable. With the advent of grid-based electricity, all the turbines and steam engines that used to power individual factories were replaced by external service providers offering a reliable source of electricity and on-demand capacity. The same is bound to happen for computational needs. Grid has been around for quite some time, but it never gained widespread corporate acceptance except for highly specialised, compute-intensive tasks like simulation or analytics; it was used mostly as high-performance computing (HPC). The reason was that applications were not geared up to take advantage of the grid. Now, with the rise of Service Oriented Architecture, this barrier is slowly diminishing: GRID and SOA are converging. The latest version of the open source toolkit for grid computing, Globus Toolkit 4.0, is based on web services.

We can describe a grid as an application which coordinates resources (CPU, storage) between different nodes. There is a scheduler/coordinator/management node, which schedules workloads or jobs across the other nodes, uses the resources of those nodes, and collects the results back at the coordinator. There are a variety of methods for communication between the coordinator and the nodes, but most systems currently use web services, so the grid itself is already built on SOA principles. Second, enterprise applications used to be built from tightly coupled components, with the same application performing many functions, and deploying these on a grid was a challenge. However, as service-oriented architecture becomes mainstream, applications are broken into components that cater to different services, with standard interfaces and methods built in so these services can be accessed remotely.
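The coordinator/node pattern described above can be sketched in a few lines of Python. This is a toy, single-process illustration only: real grid middleware such as Globus dispatches tasks to remote machines over web services, whereas here the "nodes" are plain worker functions, and all the names are my own.

```python
from concurrent.futures import ThreadPoolExecutor

def node_worker(task):
    # On a real grid this would run on a remote node's CPU;
    # here it just computes a partial sum of squares.
    lo, hi = task
    return sum(x * x for x in range(lo, hi))

def coordinator(lo, hi, num_nodes):
    # Scheduler: break the workload into one chunk per node.
    step = (hi - lo) // num_nodes
    tasks = [(lo + i * step, lo + (i + 1) * step) for i in range(num_nodes)]
    tasks[-1] = (tasks[-1][0], hi)  # last chunk absorbs any remainder
    # Dispatch the chunks in parallel and gather the partial results.
    with ThreadPoolExecutor(max_workers=num_nodes) as pool:
        partials = list(pool.map(node_worker, tasks))
    # The coordinator combines the partial results into the final answer.
    return sum(partials)

result = coordinator(0, 1000, 4)
```

The point of the sketch is the division of labour: the coordinator only splits, schedules and combines; all the heavy computation happens on the nodes.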
It does not matter where these components reside in the network. Such applications become ideal candidates for grid-based systems when the resource requirement is more than a single server can offer: with grid, the load can be cost-effectively spread across multiple servers, which provides both computational power and eliminates the single point of failure. Amazon's S3 storage service, for example, is based on web services and deployed in a grid environment. Each technology has its own powerful advantages: SOA's are well known by now, and grid creates an abstraction layer around the whole computing infrastructure. In reality, however, enterprise adoption of GRID and SOA combined is quite low at the moment. According to one research organisation, grid adoption is high among organisations taking small steps with SOA, but quite low among those with full SOA implementations. Nevertheless, the combined benefit will be hard for enterprises to ignore, and GRID will pave the way for utility computing. Are SOA and GRID poised to set off a new revolution, or is this just another IT hype cycle? We will need to wait and see, but the concept and architecture behind the grid will prevail; it may simply have merged with today's much-hyped cloud computing. I will write my next blog post about cloud computing.

Sunday, February 17, 2008

Infrastructure Outsourcing V2.0

The rise of on-demand utility services is on the horizon. We have already seen many successful models for software as a service (SaaS). The same will be extended to traditional infrastructure services such as storage on demand and processing on demand. Amazon has already started a storage-on-demand service: through its S3 storage solution it provides 1 GB of space for 15 cents per month. Amazon storage is accessed through standard SOAP and REST interfaces, and data transfer is handled over HTTP and the BitTorrent protocol. Amazon's infrastructure is built on inexpensive commodity hardware, and as more nodes get added, overall reliability increases because there is no single point of failure. That is why Amazon can provide highly reliable yet cost-effective services. Similarly, Amazon EC2 (Elastic Compute Cloud) provides entire compute capacity as a web service (CPU, memory, storage, network); this is still in beta.
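At the quoted rate of 15 cents per GB-month, the storage bill scales linearly with usage, which is what makes it a true utility. A quick back-of-the-envelope calculation (ignoring Amazon's separate bandwidth and request charges, which are not covered here):

```python
S3_RATE_PER_GB_MONTH = 0.15  # USD per GB per month, the rate quoted above

def monthly_storage_cost(gigabytes):
    # Pure utility pricing: pay only for what you actually store.
    return gigabytes * S3_RATE_PER_GB_MONTH

# Storing 100 GB for a full year:
annual = 12 * monthly_storage_cost(100)
```

Storing 100 GB would cost about $15 a month, or roughly $180 a year, with no hardware purchase, no capacity planning and no up-front commitment.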
Traditional hardware vendors are taking a different approach in this regard; an example is HP's pay-per-use storage service. HP installs the storage device at the customer's premises based on need, and each month the customer is charged for the average capacity used, subject to a minimum percentage of the installed capacity. It therefore can't be termed a true utility service in the way Amazon's is.
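One plausible reading of the HP billing model above is usage-based billing with a floor at the committed minimum. Assuming, purely for illustration, a 50% minimum commitment on installed capacity and a made-up per-GB rate (neither number is HP's published pricing):

```python
def hp_style_monthly_bill(installed_gb, avg_used_gb,
                          rate_per_gb=0.20, min_fraction=0.5):
    # Hypothetical model: the customer pays for average usage,
    # but never less than the committed minimum fraction of
    # the capacity HP installed on-premises.
    billable_gb = max(avg_used_gb, min_fraction * installed_gb)
    return billable_gb * rate_per_gb

light_month = hp_style_monthly_bill(installed_gb=1000, avg_used_gb=200)
heavy_month = hp_style_monthly_bill(installed_gb=1000, avg_used_gb=800)
```

The floor is exactly what separates this from a true utility: in a light month the customer still pays for the committed minimum, whereas with S3 the bill would drop with usage.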
Similarly, on-demand processing has reached a stage of maturity. The prominent example is BNP Paribas' on-demand processing contract with IBM. The contract allows BNP Paribas to access the capacity of 2,500 BladeCenter servers, with provision to double that if required. This service is provided through IBM's Deep Computing Capacity on Demand centre, which has up to 13,000 processors (Intel, AMD, IBM).
Gartner's May 2007 poll shows growing uptake of such utility services: 27% of 120 client organisations are now using some form of infrastructure utility, and 89% expect to do so.