From ON Magazine
By Beth Schultz
In April, as news trickled out of Mexico about the rise of a potentially deadly influenza strain, the world went into a panic that has yet to subside. The H1N1 virus continues afflicting folks as flu seasons ebb and flow around the globe.
Top of mind for local authorities is how to contain the virus should an outbreak occur in their jurisdictions. Meanwhile, clinicians hope that technology advances will one day make such worries obsolete by enabling early, automated detection of outbreaks.
One such advancement is cloud computing, a new style of computing in which highly scalable, dynamic, and usually virtualized pools of IT resources are made available to applications on demand. This computing model—made possible by the advent of sophisticated automation, provisioning, and virtualization technologies—differs drastically from traditional methods, which tightly couple an application to the underlying servers and storage systems on which it runs.
Clouds that come in many shapes
Ultimately, cloud-based infrastructures will come in many different shapes and forms, from those built specifically for an enterprise's own long-term use (private clouds) to those available to anybody who has Internet access and needs some compute or storage resources for as little as a few minutes at a time (public clouds).
With the rise of cloud computing, John Halamka, a well-respected and widely known healthcare and IT professional, thinks clinicians' hope of quick virus detection is warranted. "Imagine the day when H1N1 detection is automated because it's passively discovered in the cloud," he says.
What Halamka envisions is healthcare information from myriad sources living on such a highly interconnected and massively scalable set of shared compute and storage resources that specialized applications running on them can automatically detect potential health emergencies in their initial stages. But these are early days for cloud computing, and much work must be done before that becomes possible. That is a common theme sounded by pioneering IT leaders at Beth Israel Deaconess Medical Center (BIDMC), the National Business Center (NBC), and Hill Air Force Base (AFB), who shared with ON their experiences planning, building, and managing next-generation, cloud-based infrastructures.
A cloud for better health
In Boston, cloud computing is helping BIDMC meet electronic health records (EHR) mandates from local insurance payers and the federal government, says Halamka, who is CIO for the healthcare provider. Local payers expect the medical center's 1,500 doctors, roughly one-third of whom have offsite offices, to function as a single, integrated clinical entity. The Obama administration wants 90 percent of all clinicians and 70 percent of hospitals off paper by 2011—and has designated $44 billion in stimulus dollars as encouragement.
"Do you think that running, say, an Oracle database instance on a storage area network under your doctor's desk is going to work very well?" Halamka asks rhetorically. "Probably not. Oracle administrators are a different breed than doctors. The only way we're going to be able to achieve these goals is to provide a cloud of software functionality and storage that makes the electronic health record as easy as opening a browser."
Although one day a medical center might be able to turn to large commercial providers for such a cloud, it doesn't have that option today—and certainly did not two years ago when BIDMC began investigating the idea, Halamka says. "So we built our own cloud."
As a first step, BIDMC selected integrated practice management and electronic health records software from eClinicalWorks. BIDMC could host the application at a central location and build out an infrastructure to support physician access via secure web connections. Physicians would not need to manage their own application and database servers—those database instances to which Halamka referred.
If you build it, will they come?
Trouble was, Halamka's IT team had no idea how many of the 400 physicians who were affiliated with BIDMC but not residing on the campus would jump on board. "We didn't want to spend $1 million building an infrastructure to support 400 physicians but only have 100 show up," says Bill Gillis, manager of clinical application services at BIDMC.
BIDMC required an IT infrastructure that would make it possible to start small and easily scale as needed, and virtualization provided the answer, Gillis says. Using VMware's server virtualization and EMC's storage area network (SAN) technologies as a foundation, BIDMC built a software-as-a-service (SaaS) cloud. Physicians simply open their web-connected tablet PCs and launch eClinicalWorks. Behind the scenes—meaning in the cloud—their application requests flow over a dynamic, flexible architecture with virtual machines spinning up and down and tapping into storage as needed.
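The behind-the-scenes scaling Gillis describes can be sketched in miniature. The following is an illustrative capacity-planning rule, not BIDMC's actual code; the sessions-per-VM ratio and minimum pool size are invented numbers:

```python
import math

# Illustrative sketch (not BIDMC's system): the kind of scale-up/
# scale-down decision a cloud layer makes behind a SaaS front end.

def plan_capacity(active_sessions, sessions_per_vm=50, min_vms=2):
    """Return how many VMs the pool should run for the current load."""
    needed = math.ceil(active_sessions / sessions_per_vm)
    return max(min_vms, needed)   # keep a small floor even when idle

# As physician logins rise and fall, the pool grows and shrinks:
assert plan_capacity(0) == 2      # idle: floor of two VMs
assert plan_capacity(120) == 3    # 120 sessions -> 3 VMs
assert plan_capacity(400) == 8
```

The point of the sketch is that the physician never sees this arithmetic; it happens entirely inside the cloud.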
SAN scalability has proven particularly beneficial, Gillis says. "A year into the infrastructure build, the application vendor came out with a major change that required double the amount of memory and storage for certain aspects of the application. Had we gone with a cluster, upgrading the infrastructure would have cost us about another $300,000. But with our VMware licenses and EMC disk arrays, it only cost $30,000," he recounts.
Storage that scales
Even with just a few physician practices tapping into the BIDMC cloud, storage growth is trending higher than anticipated. "We initially planned for six terabytes of storage total, but now we're projecting eight terabytes, and we'll probably end up needing more like 10 terabytes if everybody signs up," Gillis says.
BIDMC is bringing the physicians into the cloud in a phased plan that aims to have everybody on board by Dec. 31, 2010. In the meantime, Halamka says, BIDMC is reaching out to extend connectivity from the cloud to other healthcare organizations.
"One of the things you'll see coming out of the Obama administration is a real sense of urgency for sharing data—with patient consent, of course—doctor to doctor, doctor to public health agencies, to immunization agencies, to quality measurement organizations. So a lot of our work will be about building more connectivity so the data that's in the cloud can be shared, clinical care coordinated, errors reduced, and—because public health will be involved—detection automated," he says.
Clouds over Washington, D.C.
Washington policymakers certainly recognize the cloud's power to help meet government objectives, and not just in healthcare. President Obama's fiscal year 2010 budget calls out the technology's potential for optimizing the federal data infrastructure and enabling a services orientation for any agency. Lower costs, of course, are the aim, with the federal IT budget for 2009 at a mind-boggling $74.2 billion, according to figures from the 28 departments providing data for the new federal IT Dashboard.

That presidential nod has federal agencies scrambling to figure out what cloud computing means for them. "The market would have gradually migrated over to accepting the cloud, but this is forcing that to happen more quickly," says Doug Bourgeois, director of the NBC, which provides business management systems and services to the Department of the Interior, all Cabinet-level agencies, and the Department of Defense. The NBC's four biggest service lines are contract acquisitions, financial management, human resources, and IT, and its mission is to serve federal agencies better through economies of scale.
"We can hardly keep up with the agency inquiries asking us to meet with technical folks about cloud capabilities," Bourgeois says. Fortuitously, the NBC had its cloud strategy in place before the administration opened the floodgates.
Being in the right place at the right time
Several years ago, the organization had implemented a highly scalable, services-oriented architecture that allowed the loose coupling of applications to a virtualized server infrastructure. When talk of cloud computing bubbled up, the NBC quickly realized that with a few more building blocks, it could transform itself from a typical hosting provider to a cloud-based infrastructure-as-a-service (IaaS) provider.
Cloud services, which are scheduled to be available this year, will transform the NBC's hosting business model. The NBC and its clients will no longer need to hash out application requirements, speeds and feeds, and memory. And clients will not need to make large upfront investments and then wait weeks for delivery and deployment of hardware and software. Rather, when clients are comfortable with how cloud computing operates, they will simply head to the NBC's cloud portal, work through a checklist of resources and capabilities, and the servers and storage needed for a particular application will be available as desired. Once clients no longer need access to that capacity, they return to the portal to spin down—and stop paying for—those resources.
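The self-service, pay-as-you-go flow Bourgeois describes can be sketched as a toy portal. NBCGrid's real interfaces are not public, so every class name, rate, and client identifier here is hypothetical:

```python
# Hypothetical sketch of a self-service provisioning portal: clients
# pick resources, pay while they hold them, and stop paying on release.

class CloudPortal:
    def __init__(self):
        self.allocations = {}   # client -> (vms, gb_storage)

    def provision(self, client, vms, gb_storage):
        """Client works through a checklist; capacity is available at once."""
        self.allocations[client] = (vms, gb_storage)

    def release(self, client):
        """Spinning resources down stops the meter."""
        self.allocations.pop(client, None)

    def monthly_charge(self, client, per_vm=75.0, per_gb=0.10):
        vms, gb = self.allocations.get(client, (0, 0))
        return vms * per_vm + gb * per_gb

portal = CloudPortal()
portal.provision("interior-hr", vms=4, gb_storage=500)
assert portal.monthly_charge("interior-hr") == 350.0   # 4*75 + 500*0.10
portal.release("interior-hr")
assert portal.monthly_charge("interior-hr") == 0.0     # meter stopped
```

Contrast this with the traditional model in the paragraph above: no upfront hardware purchase, no weeks-long wait for delivery.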
Like BIDMC, the NBC is building out its cloud infrastructure on top of VMware virtual machines and tiered EMC storage. To support the cloud infrastructure, it currently has about 350 physical x86 servers, 300 Unix servers, and dual mainframes, Bourgeois says.
The NBC is offering private and hybrid cloud services for its federal agency customers. Besides the IaaS offering, called NBCGrid, plans include a metered, pay-per-gigabyte cloud storage service named NBCFiles. With the hybrid cloud service, clients will be able to tap into NBCGrid or NBCFiles to handle processing bursts. The NBC's customers are still studying the cloud-based services model carefully, but they're keen on the idea of flexible, immediate, pay-as-you-go IT, Bourgeois says.
Taking off with cloud
Agile, quick, on demand—those are indeed the hallmarks, and much-desired characteristics, of a cloud infrastructure, agrees Douglas Babb, chief IT systems architect at Hill Air Force Base (AFB) in Utah and contractor with Systems Implementers, Inc. Babb is overseeing a five-year, five-tiered plan, called Project Bonfire, that has seen Hill AFB migrate from a monolithic, proprietary, static, and costly mainframe environment to a state-of-the-art, open, dynamic, and cost-effective cloud architecture.
This internal cloud provides not only software and infrastructure as a service but also platform-as-a-service (PaaS), primarily intended for developers who need capacity but not much else, Babb says. "All these are to provide very quickly a return on investment, a return on information, to the customers. We're trying to reduce the time to value," he adds.
As a first step, Hill AFB re-hosted all of its mainframe applications on a computing grid built on new x86 servers running open-systems-based Red Hat Linux. Babb is careful to explain the difference between grid and cloud computing, two models that are often confused.
"A cloud is far more than just a grid," he says. "A cloud typically has a self-service or automation aspect to it. It's scalable. In a cloud, applications are contained, and you have programmatic control and hardware abstraction with multiple applications inside the same grid. And you have some way of determining what an application is actually consuming. That's where the real difference between cloud computing and grid—and the time to value—lies."
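Babb's distinction can be made concrete with a small sketch: a grid pools hardware, while a cloud also meters what each contained application actually consumes of that pool. The application names and numbers below are illustrative, not Hill AFB's:

```python
from collections import defaultdict

# A shared pool of compute (the "grid" part) that also tracks each
# application's consumption (the part Babb says makes it a cloud).

class MeteredGrid:
    def __init__(self, total_cpu_hours):
        self.capacity = total_cpu_hours
        self.usage = defaultdict(float)   # app -> cpu-hours consumed

    def record(self, app, cpu_hours):
        self.usage[app] += cpu_hours

    def consumption_report(self):
        """Per-application share of the shared pool."""
        return {app: hours / self.capacity for app, hours in self.usage.items()}

grid = MeteredGrid(total_cpu_hours=1000.0)
grid.record("payroll", 250.0)
grid.record("logistics", 150.0)
report = grid.consumption_report()
assert report["payroll"] == 0.25     # payroll used a quarter of the pool
assert report["logistics"] == 0.15
```

Knowing these shares is what lets operators charge back, rebalance, and quantify the "time to value" Babb mentions.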
Ensuring a methodical migration
Hill AFB's migration to the cloud has been purposefully methodical. The idea of cloud computing is great, Babb says, but "you can't build a roof if you don't have walls. You can't build walls if you don't have a foundation. So we had to build from the ground floor by consolidating data centers, applications, and servers from across the base."
For Project Bonfire, then, Babb and his team initially focused on implementing the SANs and the server infrastructure for the move to cloud computing. "Then we went on to the Oracle environments, virtualization, cloud storage, and information lifecycle management, which combines all of those," he explains.
"What's really interesting is the new storage includes replication, versioning, compression, deduplication, and even spin down, all inside the cloud itself. The cloud will route information to the right tier and what's closest to the customer and most cost effective for that purpose—based on business policies or on how many copies are needed for business continuity and disaster recovery.
"All of this is inside the cloud, transparent to the application," says Babb, noting that an equivalent scenario is taking place on the server side.
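The policy-driven routing Babb describes might look, in schematic form, like the following. The tier names, thresholds, and policies are invented for illustration; the point is that the decision lives in the cloud, not in the application:

```python
# Illustrative policy-based storage tiering: the cloud routes data to a
# tier based on business rules, transparently to the application.

def choose_tier(days_since_access, needed_for_dr):
    """Pick a storage tier from (hypothetical) business policy."""
    if needed_for_dr:
        return "tier1-replicated"        # extra copies for continuity/DR
    if days_since_access <= 30:
        return "tier2-fast"              # hot data stays close to the customer
    if days_since_access <= 365:
        return "tier3-deduplicated"      # warm data, compressed and deduped
    return "tier4-spun-down"             # cold data on spun-down disks

assert choose_tier(5, needed_for_dr=True) == "tier1-replicated"
assert choose_tier(5, needed_for_dr=False) == "tier2-fast"
assert choose_tier(400, needed_for_dr=False) == "tier4-spun-down"
```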
Hill AFB uses EMC storage across its four tiers, including the new Atmos cloud-optimized storage, Babb says. For server virtualization, it uses the latest VMware technology. Hill AFB currently hosts 390 applications running on 192 vSphere virtual machines, adds Babb, noting that the number of virtual servers in this infrastructure grew at one percent per day last year.
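That one-percent-per-day figure is worth pausing on: if the rate compounds daily, a fleet multiplies roughly 38-fold over a year. A quick back-of-the-envelope check (the starting VM count is hypothetical):

```python
# What "one percent per day" means over a year, if the growth compounds.

start_vms = 5                    # hypothetical starting count
growth = 1.01 ** 365             # 1% daily growth, compounded for a year
assert 37.5 < growth < 38.5      # about 37.8x

print(f"{start_vms} VMs would become about {round(start_vms * growth)} in a year")
```

Growth like that is exactly why self-service provisioning and metering matter: nobody racks physical servers 38 times faster by hand.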
Standardize, streamline, simplify
The overriding goal is to standardize, streamline, and simplify. "We didn't want to take a lot of little messes and create one big mess," Babb says. "A consolidation effort that doesn't standardize the process, systems, and tools used to support service delivery will have a difficult time reducing total cost of ownership and improving quality of services."
And Hill AFB customers have been pleased, notes Michele Neri, deputy director and CTO at the base. "They have seen a 300 to 400 percent improvement in response rates on some applications as they go through this process," he says. "The performance improvements plus reliability are bringing them to talk to us about consolidating rather than us having to go to them."
Cool technology or not, that's really what Project Bonfire has been all about—improving the business experience, Babb says. "Bottom line, we needed quicker business results, reliable service, and efficient use of our resources."
What sets the cloud apart
The experiences of early adopters of cloud computing such as BIDMC, the NBC, and Hill AFB highlight the important differences between where IT has been and where it is going:
- A cloud is built differently from traditional IT environments—it uses flexible pools of resources, rather than fixed allocations.
- A cloud is operated differently from traditional IT environments—users are put in charge of IT resources using low- or zero-touch operational models.
- A cloud is consumed differently—usually, there's a chargeback model that's convenient for the end users, such as metered billing.
These early adopters also suggest that the benefits of cloud computing are compelling enough to make it a dominant model of IT deployment in the years ahead.