[Federal Cloud related sites]

  • GSA_RFQ: the Cloud storefront RFQ issued by GSA


GSA Outlines U.S. Government's Cloud Computing Requirements

A newly issued RFQ details what's expected from cloud computing vendors in terms of security, SLAs, geographic location, and data ownership.

By J. Nicholas Hoover, InformationWeek, Aug. 3, 2009 05:03

The General Services Administration has issued a Request For Quotation for cloud storage, Web hosting, and virtual machine services. The document is a preliminary step toward an online "storefront" to be used by federal agencies for ordering cloud computing services.

The RFQ includes ground rules for being a cloud service provider to the U.S. government. Federal agencies will retain ownership of data and applications hosted online, and they can request full copies of data or apps at any time. In addition, cloud services are to be multi-tenant in architecture, support secure remote provisioning, scale elastically, reside within the continental United States, and provide visibility into resource usage.


According to the RFQ, cloud service level agreements must provide for at least 99.95% availability, vendors must take steps to secure their services, and trouble tickets and order management must be manageable via API. Virtual machine services must allow live migration of workloads from one VM to another, while Web hosting services must offer both Windows and Linux options.
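The 99.95% availability floor translates into a concrete downtime budget, which is easy to compute. A quick sketch (the formula is generic, not taken from the RFQ):

```python
def max_downtime_minutes(availability: float, period_hours: float) -> float:
    """Downtime budget implied by an availability fraction over a period."""
    return period_hours * 60 * (1 - availability)

# A 99.95% SLA over a 30-day month allows roughly 21.6 minutes of downtime.
print(f"{max_downtime_minutes(0.9995, 30 * 24):.1f} min/month")     # → 21.6 min/month

# Over a full year, that works out to about 4.4 hours.
print(f"{max_downtime_minutes(0.9995, 365 * 24) / 60:.1f} h/year")  # → 4.4 h/year
```

In other words, the RFQ's SLA requirement leaves vendors less than half an hour of unplanned monthly outage before they are out of compliance.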

The RFQ includes a graphic of the Federal Cloud Computing Framework, which breaks cloud computing into cloud services, user tools, and core foundational capabilities. Sub-categories are software-as-a-service, reporting and analytics, and functional categories such as e-mail and order management. The RFQ is available on Scribd.

The federal government is in the early stages of adopting private and public clouds as a way of making government IT more efficient. Federal CIO Vivek Kundra has been a proponent of the cloud model, with support from the General Services Administration, the Cloud Computing Executive Steering Committee, and the Cloud Computing Advisory Council. Earlier this year, GSA issued a request for information for infrastructure as a service.

There's still confusion among government technology professionals about cloud computing. A recent MeriTalk survey found that "IT managers do not share a common understanding of the technology." Almost half say they are still learning what cloud computing is and how it works, and respondents predict it will take two-and-a-half years to realize the benefits of cloud computing.

InformationWeek has published an in-depth report on leading-edge government IT -- and how the technology involved may end up inside your business. Download the report here (registration required).


commented on Aug 4, 2009 1:01:00 PM

With government pumping hundreds of billions into the U.S. economy this year and last in an attempt to save the country from total economic disaster, more and more people are asking where and how that money is being spent. A recent article in the Washington Post outlines the problem the Obama administration, "tech-savvy as it is," according to Rep. Brad Miller (D-N.C.), will have in ensuring transparency.

The promise of transparency does not end with tracking the spending of economic stimulus dollars, however. Many people I've talked to in the federal government tell me it's also a priority on the technology side for the administration and every department in the executive branch.

An example: a security officer at a major government organization is responsible for securing 10,000 mobile devices used by people in many different offices across the country. However, each office sets its own policy on the use of those devices and is responsible for making sure its employees comply. The security officer has no view into how the devices are being used, what software is being downloaded, or what security risks they pose. The top secret data on just one of those devices could pose a major national security threat. But he can't see them - he has no visibility or transparency.

So what are those responsible for security, compliance, and technology in the government to do? One option that has been thrown around is for one of the major telecoms or ISPs to build the government its own private cloud. However, that fails to deliver on everything that cloud computing promises - low cost, quick deployment, ease of use. The economic stimulus dollars that would be necessary for this sort of project might be better spent building roads and bridges. And the timing - well, Obama's first term might be long gone by the time it's up and running.

A better option would be to leverage existing, highly secure, cloud-based managed services. That's what we've been able to do with our clients on the corporate side, who demand IT security as strong as or stronger than the government's. This sort of thing costs little compared with the total cost of ownership of the government's existing LAN operations or a private cloud. More than that, it also delivers on the other goals of the economic stimulus plan - it's efficient, environmentally friendly, secure, and transparent.

If the government wants to ensure visibility through economic stimulus - achieving it for the public and for its employees - the most advanced and promising technologies have to be leveraged. And for an administration that has already embraced web-based communication, this should be an easy move to make.

     - Jeff Ward

Healthcare Reform Battle Takes To Social Media

Posted by Mitch Wagner on August 7, 2009 01:06 PM

President Obama is bringing out the tools that got him elected--YouTube and other social media--in an effort to win support for his proposed healthcare reforms. The White House is using grass-roots Internet activism in an effort to gain control over the healthcare debate, as public skepticism of the plan mounts.

White House officials have begun a two-pronged Internet campaign, geared toward reenergizing Web-savvy allies who backed Obama last year and whose support will be critical in getting the health-care initiative through Congress. Meanwhile, the president tried to rally Senate Democrats over a seafood lunch Tuesday in the State Dining Room.

The new engagement by the White House comes at a time when Democratic lawmakers are fielding attacks on talk radio, in cyberspace and at appearances in their home districts.

In a new blog and video titled "Facts Are Stubborn Things," White House aides detail "disinformation" and "very deceiving headlines out there" on health-care reform. The message, e-mailed to tens of thousands of supporters, also encourages viewers to report anything "fishy."

The White House effort includes a series of YouTube videos on the White House Blog.

Conservatives, of course, have many objections to the Obama plan. But one in particular leaps out as appropriate for discussion here. The White House Blog writes:

There is a lot of disinformation about health insurance reform out there, spanning from control of personal finances to end of life care. These rumors often travel just below the surface via chain emails or through casual conversation. Since we can't keep track of all of them here at the White House, we're asking for your help. If you get an email or see something on the web about health insurance reform that seems fishy, send it to

Texas Republican Sen. John Cornyn was critical:

Texas Republican Sen. John Cornyn is taking issue with a Tuesday posting on the official White House blog in which the Obama administration asks supporters to report back when they receive "an email or see something on the web about health insurance reform that seems fishy" to an official e-mail address:

"I am not aware of any precedent for a president asking American citizens to report their fellow citizens to the White House for pure political speech that is deemed 'fishy' or otherwise inimical to the White House's political interests," Cornyn writes today in a harshly worded letter to President Barack Obama in which he asks the president to immediately halt the effort.

The blog adds that the White House says it's "not compiling lists or sources of information"; it just wants to know what's being said so it can refute claims it believes to be false.

What do you think? Is the White House simply trying to build support for its proposal, or have they crossed the line into intimidating political opponents? Is it appropriate for the White House to ask citizens to use social media to forward opposition arguments?

InformationWeek's Informed CIO series lays out 10 steps to achieve excellence in service assurance initiatives. Download the report here (registration required).

Microsoft's Drag-And-Drop Windows Azure Cloud

Posted by John Foley @ 09:45 AM | Aug. 7, 2009

Citing an unfavorable change in tax laws, Microsoft is moving its Windows Azure cloud from a data center in Washington state to one in Texas. It's an interesting new twist in the cloud computing market: moving a cloud across state lines in response to the regulatory climate.

In a blog post on its Windows Azure site on Tuesday, Microsoft revealed that it's moving Azure-based apps from its Northwest region, citing "a change in local tax laws." According to an article on DataCenterKnowledge.com, Microsoft faces a 7.9% tax on new data center gear, following a period in which it apparently paid no such tax. In February, it was reported that a tax exemption could be worth more than $1 billion to Microsoft and other data center operators in Washington.

Michael Manos, the former GM of Microsoft data center services and now senior VP with data center specialist Digital Realty Trust, suggests that we're witnessing the beginning of "a cat and mouse game that will last for some time on a global basis." In a blog post, Manos provides examples of how government regulation can affect data center location and cost, and he makes the point that laws and regulations can change after a data center gets built. What then?

The answer is to move the data center or, in the case of Microsoft, to move the services that represent the data center's biggest area of growth, its Windows Azure cloud services. (Azure is due for availability in November.) "The 'cloud' will ultimately need to be mobile in its design," Manos writes.

Moving the data center itself would be an extreme measure, though within the realm of possibility. While at Microsoft, Manos helped conceive a blueprint for modular data centers that can be assembled like Lego blocks and, presumably, disassembled and reassembled.

A more feasible approach, and the one that Microsoft appears to be taking, is to move the cloud computing software layer and associated data, while leaving the physical servers in place. Manos uses the metaphor of the data center as a "folder." If data center operators were able to drag and drop clouds from one folder (data center) to another, it would make them less vulnerable to changing conditions, including laws, taxes, customer demographics, even wars and natural disasters.

Cloud architects and cloud users have already begun to think in terms of geographic location, as evidenced by Amazon Web Services' "regions" and "availability zones." And Microsoft gives early adopters of Azure a degree of control over where their apps and data are stored. Given Microsoft's impending cloud shift, Azure users are being instructed to disable the "USA-Northwest" option for any new applications they build. In migrating Azure to a different data center, Microsoft is taking the concept of the movable cloud to the next level.

Want to see what a cloud data center looks like? Here's our image gallery of Microsoft's San Antonio data center, the new home of the Azure cloud. The photos were taken by InformationWeek's Nick Hoover last year on a walk-through of the facility.

To keep applications humming in virtualized environments, you must move beyond manual monitoring and management. Find out about that and more in our digital supplement on virtualization and the cloud, part of InformationWeek's Green Initiative to reduce our carbon footprint. Download the supplement here (registration required).

NIST Lab Director Tackles Cybersecurity, Cloud Computing

By J. Nicholas Hoover / InformationWeek / Aug. 7, 2009 11:48

Cita Furlani explains the nuts-and-bolts work of defining key government IT standards and the job of working with federal agencies on adoption and implementation.

The National Institute of Standards and Technology's IT Laboratory plays a key role in government cybersecurity, setting standards that federal agencies are required to follow. InformationWeek discussed NIST's role, including the fine line between setting standards and setting policy, with Cita Furlani, director of NIST's IT Lab.

InformationWeek: How would you describe NIST's cybersecurity role, and how does NIST influence what federal CIOs and IT professionals implement?


Furlani: We have the mandate from Congress under the Federal Information Security Management Act to develop standards, and once they become a Federal Information Processing Standard, agencies are required to actually use them. Mostly we limit our FIPS development to very core technologies. The encryption modules and the Personal Identity Verification standards are the most recent, or at least the most visible. Most of the rest of what we do is really considered guidelines; it's not mandated.

InformationWeek: How do you work with the federal IT crowd? They must say, 'How do we actually implement this stuff?' Do you get peppered with a lot of questions?

Furlani: Oh yes. We have a large outreach effort. The staff is out with these research activities, they're out with CIO Council. We publish everything first as a draft publication for public comment from government agencies as well as anybody else. Sometimes some of that is put out for a second draft when you get enough comments back. When we are publishing FIPS, we make available every public comment and every response to a public comment.

InformationWeek: Vendors respond quickly when a FIPS document goes out. After the FIPS 140-2 encryption standard was released, for example, a slew of vendors said, 'Our USB key is encrypted to 140-2 compliance.'

Furlani: We have a certification program in place under our sister laboratory, Technology Services, through the National Voluntary Laboratory Accreditation Program. There are accredited labs that certify whether a particular piece of software meets the crypto requirements, and then those are published on our Web site.

InformationWeek: What about recommended actions? You recently put out a final document, Special Publication 800-53, on recommended security controls for federal information systems, for example.

Furlani: The way 800-53 is designed, you need to understand what level of risk you are taking before you decide what level of controls to implement. It's like locking your house. You can lock everything down with double locks and everything else if there's something in some room you really want to protect, but typically, because you want to go in and out more easily, you don't protect your house at the level you could. What we've tried to do is give system managers that trade-off for understanding what mechanisms to use. If you've got a low-risk system, you can choose from among one set; if you've got a high-risk system, you can choose among an additional set.
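Furlani's house-lock analogy maps naturally onto the baseline idea in 800-53: assess the system's impact level, then pick the matching control set. A toy sketch of that selection step (the control IDs and groupings here are an illustrative subset, not the real catalog):

```python
# Illustrative baselines in the spirit of SP 800-53; not the real catalog.
BASELINES = {
    "low":      {"AC-1", "AC-2", "IA-2"},
    "moderate": {"AC-1", "AC-2", "AC-6", "AU-2", "IA-2"},
    "high":     {"AC-1", "AC-2", "AC-6", "AU-2", "IA-2", "SC-7", "SI-4"},
}

def controls_for(impact: str) -> set:
    """Pick the baseline control set matching a system's assessed impact level."""
    if impact not in BASELINES:
        raise ValueError(f"unknown impact level: {impact!r}")
    return BASELINES[impact]

# A higher assessed risk pulls in everything below it plus additional controls.
assert controls_for("low") <= controls_for("high")
print(sorted(controls_for("moderate")))  # → ['AC-1', 'AC-2', 'AC-6', 'AU-2', 'IA-2']
```

The point of the structure is the one Furlani makes: the risk assessment comes first, and the control set follows from it rather than being chosen control by control.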

InformationWeek: Why is 800-53 an important publication?

Furlani: Primarily because it's so needed to understand why you're making these decisions. Another incredibly important part is that we do not have a mandate for the intelligence community, but they are engaged and helped define what the goals are, as well as the Department of Defense. So really for the first time, we have a baseline set of controls across the entire federal sweep of agencies, by voluntarily agreeing what those should be.

Defining The Cloud


InformationWeek: How applicable is current FISMA guidance to cloud computing?

Furlani: A lot of what needs to be clear is a good definition of what we're talking about before we can start saying how you might protect it. Then, looking at the trade-offs between security and privacy, usability, how you can scale identity management; there are many research-related issues here. How do you measure a truly secure system, and against what risk levels are you trying to measure?

InformationWeek: In an area like cloud computing, which is amorphous, widely defined, and still in a developmental phase, how does that play out? How do you pick your battles over what to define as standards now versus what to do later?

Furlani: It can and will be an enormous savings if we can figure out how to do it correctly, and I think that's what we're all struggling with -- Vivek Kundra, GSA, and the CIOs in general. I co-chair a subcommittee of the CIO Council that helps identify some of these ongoing technology infrastructure constraints; it's the subcommittee under the Architecture and Information Committee, the Technology Infrastructure Committee. We actually have the change-control responsibilities for the Federal Desktop Core Configuration and IPv6, so I'm in the thick of thinking about what the government should do about cloud computing, both with NIST and with the CIO architectural control community. Actually defining what cloud computing is is number one, and number two is figuring out where there should be standards.

InformationWeek: While the government has been pushing IPv6 for some time, standardization has begun to take final shape, and there are requirements that soon everything new has to be IPv6. How do you get the testing program ramped up?

Furlani: We put out the government profile of what the government should expect from IPv6 and what should be measured. With the labs, we're setting up accreditation programs so that we can do the same kinds of work, with the University of New Hampshire doing the bulk of it. We'll work with OMB to determine exactly what a requirement clause should say to make these requirements imperative in the acquisition process.

InformationWeek: What's going to be your role in taking what was developed in the 60-day federal cybersecurity review and implementing it? I wanted to drill down into one area where there's a lot of work to be done in standardization, which is identity management.

Furlani: In the big picture, portions of the Comprehensive National Cybersecurity Initiative were focused on the research direction. Another piece we bring in is the whole understanding of standards and their development and the recognition internationally. IT is global, so if you're talking about DNSSEC or Internet connectivity or use, building the standards internationally, understanding those standards, and bringing that back into the community is important.

In identity management, it's not just the context of IT management. I need to understand you are you, but I also need to understand if the computer is yours or somebody else's. Being 'you' may mean something entirely different in your work life and your private life. There's bridging these trust models, some kind of federated credentialing, and understanding scalability issues. Role-based access comes into play. There's a lot of research there, and there's also a lot of moving the standards forward.

ID Management Remains Challenge For Federal Agencies

By J. Nicholas Hoover / InformationWeek / Aug. 5, 2009 09:58

Some of the hurdles faced by the U.S. government include funding, organizational structure, and data protection.

Despite numerous federal mandates such as Homeland Security Presidential Directive 12, identity management remains a challenge for government agencies, according to federal cybersecurity and IT officials.

"It's a challenge that it is an unfunded mandate," said Ken Calabrese, CTO of the Department of Health and Human Services, during a panel discussion on identity management in Washington. "There are some straight dollar benefits, but it's just something that doesn't resonate with people at this point. Articulating business value is a tough challenge."


Over the last few months, the Department of Health and Human Services has conducted a review of its identity management efforts, concluding that the agency needs more coordinated control over logical and physical security. HHS is carrying out a pilot program with that in mind. Now, when someone hands in their office keys upon leaving a job, their ID is automatically deleted from all systems. The agency is also testing single sign-on capabilities.
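The pilot's "keys in, accounts out" behavior is essentially automated deprovisioning: one separation event cascades to every system that holds an account for the departing employee. A minimal sketch, with all system names invented:

```python
class IdentityStore:
    """Toy directory tracking which systems hold an account for each user."""
    def __init__(self):
        self.accounts: dict[str, set[str]] = {}

    def provision(self, user: str, system: str) -> None:
        self.accounts.setdefault(user, set()).add(system)

    def deprovision_all(self, user: str) -> set[str]:
        """On separation (keys handed in), remove the user everywhere at once."""
        return self.accounts.pop(user, set())

store = IdentityStore()
for system in ("email", "vpn", "badge-reader", "hr-portal"):
    store.provision("jdoe", system)

removed = store.deprovision_all("jdoe")
print(sorted(removed))              # the single event cascades to every system
print(store.accounts.get("jdoe"))   # → None
```

The coordination problem HHS describes is exactly the one the single `deprovision_all` entry point solves: without a central index of accounts, each system has to be cleaned up by hand.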

There are areas where HHS still needs to fill in the blanks. For example, it needs to study ways to more thoroughly prove identity in online communications between doctors and patients involving electronic health records, especially at the nationwide scale the White House has proposed for e-health records, Calabrese said.

NASA faces a different challenge. It's not that there are too many people logging into NASA systems, but that there are so many constituent groups. The space agency is revamping its network architecture to create "zones" that communities of interest can access, similar to role-based access. There might be one zone for NASA workers, another for universities, and another for foreign space agencies.
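The zone model NASA describes is close to classic role-based access: each community of interest is granted entry to certain zones. A hypothetical sketch (zone and role names are invented for illustration):

```python
# Hypothetical zone layout: which communities of interest may enter each zone.
ZONES = {
    "nasa-internal": {"nasa-employee"},
    "research":      {"nasa-employee", "university-partner"},
    "international": {"nasa-employee", "foreign-agency"},
}

def can_access(role: str, zone: str) -> bool:
    """Role-based check: a community of interest may enter only its own zones."""
    return role in ZONES.get(zone, set())

print(can_access("university-partner", "research"))   # → True
print(can_access("foreign-agency", "nasa-internal"))  # → False
```

Partitioning the network this way means the architecture, not per-user configuration, keeps a university partner out of internal NASA systems.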

Still, identity and access management is no panacea, said Jerry Davis, NASA's deputy CIO for IT security. "The problem isn't just knowing who's on the network," he said. "When you have someone who's connected to the network and they're a true person, what are they connecting to the network with? Two-factor authentication isn't going to help if you've got a bug that does keylogging or whatever."

At the Internal Revenue Service, the organizational structure poses challenges in coming up with a comprehensive identity management and authentication policy. The IRS splits cybersecurity into two groups, operations and policy, and neither has control over physical access.

Based on data it stores on household income, the IRS is being considered for a role in verifying income for lenders and home buyers, according to Andrew Hartridge, IRS' director for cybersecurity policy and programs. The IRS could leverage its data to create challenge questions as a way of authenticating home buyers, but Hartridge said that would require significant adjustments to prevent disclosure of information to someone with unauthorized access.

The Federal Aviation Administration, meanwhile, has sorted out who manages physical access, who manages logical access, and who verifies identities, and it uses a portable "card mobile" to distribute smart cards with digital credentials to employees at various FAA locations. Yet, FAA CIO Dave Bowen echoed some of the problems other agencies face, including the lack of funding for HSPD-12 and the difficulty in convincing line of business managers of the importance of strong identity management measures.

There's a big buzz surrounding Government 2.0 -- the revolution that's bringing the principles and value of the Web as a platform to the business of governing. Attend Gov 2.0 Expo Showcase and hear innovators show how this is really happening. At the Washington Convention Center, Sept. 8. Find out more and register.

Inside Terremark's Secure Government Data Center

By J. Nicholas Hoover / InformationWeek / Jul. 28, 2009 02:01

Terremark's new cloud-computing facility for U.S. government customers features concrete walls, bomb-sniffing dogs, motion-sensing videocams, and a promise of 100% service availability.

As U.S. government agencies consider the move to cloud computing, where will their clouds be hosted? Terremark, a hosted service provider to U.S. government customers, has built a data center specifically for that purpose.

Terremark's data center in Culpeper, Va., is designed with the high-security requirements of government agencies in mind. The facility, which opened last year, features blast-proof exterior walls, traditional co-located servers, and a glass-enclosed room of multitenant servers that are shared by customers. Department of Defense-approved fences and 12-foot berms line the perimeter. More than 250 motion-sensor cameras send video feeds to the security office.


"This building is like a secure bunker, and the campus is like a military base," Terremark senior VP Norm Laudermilch said during a tour of the location this week.

Federal government Web sites are among those now running on Terremark's infrastructure-as-a-service offering, called Enterprise Cloud. A number of other agencies are soon expected to announce plans to host applications on the service. Terremark also serves companies in the private sector.

As with Amazon (NSDQ: AMZN) Web Services, customers of Terremark's Enterprise Cloud pay for CPUs, storage, and memory on a flexible per-use basis. The service can support basically any application that runs on Windows or Linux.

To deliver its Enterprise Cloud services, Terremark employs VMware hypervisors and management tools. In addition, Terremark developed its own tools and wrote custom code to round out its data center management platform. Terremark is rolling network security data, customer reporting, and ticket generation and management into an integrated tool. The company also developed its own storage and server management tools.

Terremark backs up its services with a 100% up-time SLA. The company has rows of flywheels -- 660-lb metal discs that spin at 7,700 RPM -- to keep the power running briefly in the event of an outage. Those are backed up by massive, diesel-powered Caterpillar generators that will kick in 7 seconds later.
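A rough back-of-the-envelope calculation suggests why a 660-lb disc at 7,700 RPM can bridge a 7-second gap comfortably. The disc radius below is an assumption, since the article gives only mass and speed:

```python
import math

# Back-of-the-envelope energy stored in one flywheel.
mass_kg = 660 * 0.4536           # 660 lb ≈ 299 kg
radius_m = 0.25                  # ASSUMED disc radius (not in the article)
rpm = 7700
omega = rpm * 2 * math.pi / 60   # angular velocity in rad/s

inertia = 0.5 * mass_kg * radius_m ** 2   # solid disc: I = ½ m r²
energy_j = 0.5 * inertia * omega ** 2     # rotational energy: E = ½ I ω²

print(f"~{energy_j / 1e6:.1f} MJ stored")                # ~3.0 MJ
print(f"~{energy_j / 100_000:.0f} s at a 100 kW load")   # → ~30 s
```

Under these assumptions a single flywheel stores a few megajoules, enough to carry a sizable load for tens of seconds, well past the 7 seconds the generators need to kick in, and a row of them extends that margin further.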

Terremark's facility has a 24x7 security operations center with internally developed intrusion prevention and intrusion detection capabilities. FISMA auditors recently spent more than two weeks checking out Terremark's systems. The company is one of the few co-location vendors accredited by the General Services Administration to handle classified data.

Terremark has completed one data center on the Culpeper campus. A second facility is scheduled to open next year at the same location, and three other data centers are planned there. Terremark originally budgeted $250 million to construct the site, but the company will likely need to raise that estimate.

One vehicle entry point is manned with armed guards, wedge barriers, and reinforced gates to prevent unauthorized entry. Another, for delivery trucks, requires vehicles to pull through a set of garage doors, to be searched before going through a second set of doors. The undersides of trucks are checked with mirrors, while specially trained dogs sniff for bombs.

Customers with special security requirements can get 9-gauge steel mesh installed under the data center's raised floors as an added precaution, and those housing classified or sensitive information get not just metal cages around computer equipment but floor-to-ceiling concrete walls with locked doors.

Demand for Terremark's Culpeper facility is strong, with up to 40 tours going through the facility on a given day. Terremark anticipated that it would take 18 to 24 months to sell out co-location space in the first data center, but it did so within 5 months, said Laudermilch.


NSA Using Cloud Model For Intelligence Sharing

By J. Nicholas Hoover / InformationWeek / Jul. 21, 2009 09:55

New system will run the Hadoop file system on commodity servers and include search, discovery, and analysis tools.

The National Security Agency is taking a cloud computing approach in developing a new collaborative intelligence gathering system that will link disparate intelligence databases.

The system, currently in testing, will be geographically distributed in data centers around the country, and it will hold "essentially every kind of data there is," said Randy Garrett, director of technology for NSA's integrated intelligence program, at a cloud computing symposium last week at the National Defense University's Information Resources Management College.

The system will house streaming data, unstructured text, large files, and other forms of intelligence data. Analysts will be able to add metadata and tags that, among other things, designate how securely information is to be handled and how widely it gets disseminated. For end users, the system will come with search, discovery, collaboration, correlation, and analysis tools.

The intelligence agency is using the Hadoop file system, an implementation of Google (NSDQ: GOOG)'s MapReduce parallel processing system, in part because it makes it easier to "rapidly reconfigure data" and for Hadoop's ability to scale.
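MapReduce's two-phase model (map emits key/value pairs, a shuffle groups them by key, and reduce aggregates each group) can be sketched in a few lines of plain Python. This is a toy, single-process illustration of the programming model, not anything resembling NSA's actual system:

```python
from collections import defaultdict
from itertools import chain

def map_reduce(records, mapper, reducer):
    """Minimal single-process MapReduce: map, shuffle by key, reduce."""
    shuffled = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        shuffled[key].append(value)          # the "shuffle" phase
    return {key: reducer(key, values) for key, values in shuffled.items()}

# Toy use: count records per handling tag (tags invented for illustration).
records = [
    {"tags": ["secret", "sigint"]},
    {"tags": ["secret"]},
    {"tags": ["unclassified"]},
]
counts = map_reduce(
    records,
    mapper=lambda r: [(tag, 1) for tag in r["tags"]],
    reducer=lambda key, values: sum(values),
)
print(counts)  # → {'secret': 2, 'sigint': 1, 'unclassified': 1}
```

Because each map call and each reduce call is independent, a framework like Hadoop can spread them across thousands of commodity servers, which is the scaling property the article refers to.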

The NSA's decision to use cloud computing technologies wasn't about cutting costs or seeking innovation for innovation's sake; rather, cloud computing was seen as a way to enable new scenarios and unprecedented scalability, Garrett said. "The object is to do things that were essentially impossible before," he said.

NSA's challenge has been to provide vast amounts of real-time data gathered from intelligence agencies, military branches, and other sources of intelligence to authorized users based on different access privileges. Federal agencies have their own systems for sharing information, but many remain disconnected, while community-wide systems like Intellipedia require significant user input to be helpful.

The NSA effort is part of Intelligence Community Directive 501, an effort to overhaul intelligence sharing proposed under the Bush administration. Current director of national intelligence Dennis Blair has promised that intelligence sharing will remain a priority.

"The legacy systems must be modernized and consolidated to allow for data to actually be shared across an enterprise, and the organizations that collect intelligence must be trained and incentivized to distribute it widely," he said in response to questions from the Senate prior to his confirmation.

The new system will run on commodity hardware and "largely" on commercial software, Garrett said. The NSA will manage the arrayed servers as a pool of resources rather than as individual machines.

InformationWeek has published an in-depth report on leading-edge government IT -- and how the technology involved may end up inside your business. Download the report here (registration required).

Google Apps Contract In LA Hits Security Headwind

By Thomas Claburn / InformationWeek / July 20, 2009

The City of Los Angeles faces worries about privacy and security as it considers moving to Google Apps.

Los Angeles Mayor Antonio Villaraigosa was asked on Friday to weigh potential privacy and security risks before allowing the City of Los Angeles to ditch Microsoft (NSDQ: MSFT) and Novell (NSDQ: NOVL) for Google (NSDQ: GOOG).

In a letter to the Mayor, Pam Dixon, executive director of the World Privacy Forum, urges "a thorough analysis and risk assessment of all privacy and other confidentiality impacts that may occur" if the City of Los Angeles goes through with its proposal to replace its e-mail system and office applications with Google Apps.

"Our concern is that the transfer of so many City records to a cloud computing provider may threaten the privacy rights of City residents, undermine the security of other sensitive information, violate both state and federal laws, and potentially damage vital City legal and other interests," Dixon wrote.

Dixon's letter follows a week of handwringing about Web security after a hacker who broke into Twitter in May began distributing the company's internal documents to media sites and bloggers.

The $7,250,000 contract under consideration in Los Angeles is expected to cost $8,312,410 over five years and to save the city $6,253,373 in Microsoft Office and Novell GroupWise license fees over that same period. The City expects to realize an additional $7,528,324 in "soft savings" by shifting IT personnel to other purposes. It's also considering reducing its software license expenses through layoffs.

About $1,507,209 of the potential funding for the Google transition is expected to come from Los Angeles' portion of a $70 million class action antitrust settlement with Microsoft that occurred in June 2006.

A report on the proposed contract from the Office of the City Administrative Officer states that Microsoft Office is generally considered to have a larger feature set than Google Apps, and that because some City employees rely on those features, about 20% of the City's computer users are expected to continue using Microsoft Office indefinitely.

The report assumes that about 30,000 workers will transition to Google Apps. The Los Angeles Police Department, however, is still evaluating the security and privacy implications of the proposed change. If the LAPD chooses not to make the change, only 17,000 City employees are expected to shift to Google Apps.


Google (NSDQ: GOOG) says that government agencies on all levels are considering cloud computing as an option. "We're excited that Los Angeles is considering joining other cities like Washington, D.C. and Seattle who have chosen cloud computing for their technology solutions," a spokesperson said in an e-mail. "Hosted software is designed to be extremely reliable, safe, and secure."

According to the City's report, the City's Information Technology Agency (ITA) "has stated that the level of security for City data will be higher under the proposed contract than is currently the case."

Nonetheless, the LAPD and the City Attorney have a considerable amount of highly sensitive information. The ITA believes that document confidentiality can be maintained through enterprise encryption options available through Google.

Rick Gordon, managing director of the Civitas Group, a security consulting firm, has concerns about lack of a reliable audit standard, data lock-in, and Google's opacity regarding its internal data security procedures. He characterizes the City report as "hand-waving at its worst."

"While the City will audit the service provider, neither has articulated a reliable standard to which the provider will be audited," he said in an e-mail. "More troubling is that LA will rely on the contract winner to help define a security standard -- an incestuous practice. Rather than having the provider define the security itself, the City should be looking to established third-party standards that hold the provider accountable to a reasonable level of security."

Concerns about lock-in, however, aren't exclusive to Google. According to the City's report, the LAPD currently uses 1,200 Microsoft Access databases, and until Google offers an alternative that can import the data from Access, the City will have to continue to support Access.

Chances are, however, security and privacy worries will be addressed to the satisfaction of most stakeholders. The Obama administration is specifically promoting cloud computing as a major government initiative to save money and better serve users of government systems.

With Seattle and Washington, D.C. already committed to cloud computing, and Los Angeles on the verge of doing the same, further federal, state, and local conversions to the cloud seem likely to follow.

InformationWeek and DarkReading.com have published a report on data-centric protection. Download the report here (registration required).

GAO: Federal Grants Web Site Shows 'Weaknesses'

By J. Nicholas Hoover / InformationWeek / July 17, 2009

Poor usability and mismanagement plague Grants.gov, which gives access to $500 billion in annual grants, a Government Accountability Office survey finds.

Grants.gov, a federal government Website designed to give citizens access to $500 billion in annual government grants, is mismanaged and difficult to use, the Government Accountability Office said in a report released this week.

The study, which included a survey of users, found that applicants consistently run into difficulty using the Website, sometimes so much so that their grant submissions come in past their due dates. For example, prospective site users found registration difficult -- rather than being instantaneous, it takes up to two weeks -- and the site lacks even a standardized grant application process.

Grants.gov is funded by contributions from the 26 member agencies of the Grants Executive Board, but the GAO found that the board members were not paying on time, which caused project managers earlier this year to warn of an impending shutdown, hold off on site updates, and postpone payments to vendors.

The GAO also found that Grants.gov, currently administered by the Health and Human Services Department, needs better oversight. There are no written policies for coordination between HHS and the 26 member agencies of the Grants Executive Board, for example.

The Bush administration launched Grants.gov as part of its larger e-government initiative in 2002, and the GAO has criticized the site's progress multiple times since then. Use of the site has skyrocketed in recent years, with applications jumping 56% in the last year alone, making the site's usability and stability increasingly critical. Half of all late grant applicants in a GAO survey cited Website performance as a key reason their grants were late.

Earlier this year, OMB director Peter Orszag predicted significant increases in Grants.gov traffic in the wake of the economic stimulus package, and ordered HHS to take steps to mitigate any problems from that new traffic.

Despite improvements in February and April, visitors continued to experience problems. Additional improvements to the site were made in May, including improved log-in performance and user interface changes on grant application pages.


In its report, the GAO recommended HHS institute better performance management measures with help from OMB, develop new and more thorough guidance for agencies carrying out various roles with respect to the site, find ways to get input from users, and implement standardized government-wide grant processing. In a letter, HHS concurred with those findings, but future steps remain unclear.

In a Webcast this week, Grants.gov representatives noted future changes to the site, including improved network performance, new capacity, and increased system monitoring as part of a project known as "Boost."

In response to the report, Sens. Joseph Lieberman, I-Conn., and George Voinovich, R-Ohio, pushed for an overhaul of Grants.gov, and urged the House of Representatives to adopt and pass a bill similar to one passed in the Senate in March that would place a new regulatory regime on Grants.gov.

"I am disappointed that Grants.gov has not received adequate support and attention, which led to the Web site's recent difficulties handling increased volumes of grant applicants," Lieberman said in a statement. "OMB must strengthen both the management and technology behind Grants.gov, while streamlining and increasing transparency of the grant process."

The Senators' bill, the Federal Financial Assistance Management Improvement Act, would reauthorize and update a similar bill passed in 1999, and would write the maintenance of Grants.gov into law.

The OMB would be required to maintain a Website that would let applicants search for, apply for, manage, track and report on grants, and would also have to report biennially on the Website's progress and submit a strategic plan to Congress.


Groups Seek Cloud Computing Standards

By J. Nicholas Hoover / InformationWeek / July 15, 2009

A number of bodies jointly seek standardization in areas including security, management frameworks, data exchange formats, and cloud taxonomies and reference models.

A group of standards bodies and industry groups has joined forces to collaborate on a strategy behind future cloud computing standardization efforts, the Object Management Group announced this week.

"Rather than one-by-one agreements and developing hundreds of standards that overlap, we're working together," Richard Soley, chairman and CEO of the Object Management Group, said Wednesday during a panel discussion at the National Defense University Information Resources Management College's Cloud Computing Symposium.

The group of standards bodies, called "the Cloud Standards Coordination Working Group," includes the Organization for the Advancement of Structured Information Standards, Object Management Group, Distributed Management Task Force, Storage Networking Industry Association, Open Grid Forum, Cloud Security Alliance, Open Cloud Consortium, Cloud Computing Interoperability Forum, and the TM Forum.

The body is looking at standardization in a number of specific areas, including security, interfaces to infrastructure-as-a-service, information about deployment such as resource and component descriptions, management frameworks, data exchange formats and cloud taxonomies and reference models.

The form and scope of those standards are to be determined, and Soley said the groups are looking for much more input from both users and industry. "Standards don't work without heavy participation by prospective end users of those standards," he said. To help facilitate that process, the bodies have set up a wiki to allow community and customer participation in determining the best paths for standards development.

Community participation, deliberate action, and planning must be a vital part of any successful standards process, Gartner VP David Cearley said during the same panel conversation. Otherwise, he said, cloud standards efforts could fail miserably.

"Standards is one of those things that could absolutely strangle and kill everything we want to do in cloud computing if we do it wrong," he said. "We need to make sure that as we're approaching standards, we're approaching standards more as they were approached in the broader Internet, just in time."

Earlier this year, a group of companies and organizations created the Cloud Computing Manifesto, an effort that quickly became an object lesson in the potential pitfalls of standards work, as several key companies, including Microsoft (NSDQ: MSFT), Google, and Amazon, decided not to participate.


GSA Inches Closer To Cloud Computing 'Storefront'

By J. Nicholas Hoover / InformationWeek / July 15, 2009

Using the GSA Storefront, federal agencies would choose infrastructure, Web applications, or other IT services to begin a streamlined procurement process.

Federal CIO Vivek Kundra began outlining in greater detail Wednesday an online store where government agencies will soon be able to procure cloud computing services with a few clicks of a mouse. He showed off a mockup of the site and described some of its features.

Kundra wouldn't say when the store, which his PowerPoint presentation referred to as the GSA Storefront, will launch, but the mockup is evidence that the government is inching closer.

Using the GSA Storefront, agencies would identify the services they need -- be they infrastructure, Web applications, or otherwise -- from a series of menus, and pay either by credit card or requisition. In the background, the government would have already taken care of the complex security and documentation that agencies sometimes take months to carry out themselves before procuring and deploying a technology.

"Agencies will be able to say, 'I'm interested in buying a specific technology,' and we will abstract all the complexities for the agencies," Kundra said Wednesday in a keynote address to a government cloud computing symposium at the National Defense University's Information Resources Management College. "Part of the reason we haven't been able to move as a government leveraging these services is that it's too hard, we make it too complex."

The GSA Storefront will look similar to a typical online store, with familiar elements such as a shopping cart, order history, and user profiles. The mockup breaks down services into software-as-a-service, infrastructure-as-a-service, platform-as-a-service, and citizen engagement services. Under each heading would be a series of services, such as Salesforce (NYSE: CRM) under SaaS or Facebook under citizen engagement.

The Obama administration dedicated a sidebar in its 2010 budget proposal to cloud computing, and Kundra has been a strong proponent of its use as a way to cut costs and more quickly and efficiently procure and manage IT resources since his time as CTO of the city of Washington, D.C.

He has made cloud computing one of his key initiatives for the federal government, setting up a cloud working group of federal CIOs, led by GSA CIO Casey Coleman, to explore options for accelerating the push of cloud computing into government.

Private industry will likely play a role in this push as well. Google Enterprise president David Girouard said in a separate keynote address that Google is working on FISMA certification and accreditation for its services, due by the end of the year. That could potentially be used as a launch point to get Google Apps, for example, onto the GSA Storefront.


Cloud Computing: 10 Questions For Federal CIO Vivek Kundra

By J. Nicholas Hoover / InformationWeek / June 17, 2009

Kundra discusses his thoughts about cloud computing in government, and what it would take to make cloud technologies easier to adopt in the federal space.

Federal CIO Vivek Kundra is well known for innovative approaches to government IT. He introduced Google Apps to the city of Washington, D.C., when he was its CTO back in 2007.

He's brought with him to the federal government a philosophy that cloud computing could save money, facilitate faster procurement and deployment of technologies, and allow government agencies to concentrate more on strategic IT projects.

InformationWeek sat down with him at his office last week to discuss his thoughts about cloud computing in government, and what it would take to make cloud technologies easier to adopt in the federal space.

InformationWeek: The President's 2010 budget request mentions cloud computing pilot projects. Give us a bit more context around those.

Federal CIO Vivek Kundra: One of the biggest challenges has been that agencies and departments have thought and spent money vertically, yet technology is most effective when it's implemented horizontally in terms of fundamentally transforming business processes.

The other challenge is that it used to be that people would come to work and have access to the greatest technologies, because the government and the corporate environment [were] investing in technologies that were leading edge. Now, we've been left behind and we're seeing massive innovation happening in the consumer space.

Part of that budget in cloud computing is to leverage platforms that are free [and] make sure we look at cloud computing in terms of platforms, which can be deployed horizontally across the government.

The cloud computing investment in the 2010 budget reflects the administration's desire to drive down costs, drive innovation across the federal government, [and] make sure we're making available technologies to the workforce that may be available to them elsewhere.

InformationWeek: Are you setting aside money in the 2010 federal budget for this?

Kundra: If you looked at some of the things the GSA [General Services Administration] has come out with, whether the RFI or the summit on cloud computing, those are ideas being fleshed out as we speak. As we get closer to October, project plans and specific funding out of the $33 million [in the budget request] will be determined.

InformationWeek: What are these pilots and what are you trying to do with them?

Kundra: We're trying to turn this into a scaling problem, and we're trying to make sure that we're looking at the lowest common denominator across all agencies, very simple tools like blogs, and video, and lightweight workflow platforms or public participation platforms.


Kundra [cont.]: The key is to make that available to the federal government in a way that's easy and handles security up front in that you bake security requirements into the architecture. The goal is to make it as simple as, if you in your personal life wanted to sign up for an e-mail or photo-sharing platform or storage online, from the time you submit your credit card, the service will be provisioned real-time.

Why can't we do that with all agencies? Why can't we make sure we have infrastructure and platforms that are FISMA-certified [Federal Information Security Management Act of 2002] up front? Why can't we make sure procurement processes have been followed and provisioning issues around making sure we can provision these platforms real-time are addressed?

We're moving from this notion of 'here's a schedule' to the notion of 'here's a platform that can be provisioned real-time.' How do we provide those services? That's what we're working on. But it's about moving away from having schedules and this idea of GSA as an entity that has a bunch of schedules, but there's no center of gravity when it comes to information technology across the federal government. This allows us to create a center of gravity.

InformationWeek: So are you working with GSA closely to help them become that center of gravity?

Kundra: We're working very closely with GSA to be that center of gravity.

InformationWeek: What does that mean? What do you have to do with them, develop a new way for dealing with IT?

Kundra: That means one, looking at policy issues around information security. FISMA is a perfect example. Today, every agency has to get their own certification and accreditation even if they are using the same set of technologies. Imagine how much money we could save if we were able to have a central place where you could get certification and inherit those rights.

Second is actually creating a storefront that will be agency-facing, that agencies could, with the same ease that consumers do it, provision services.

Third is the underlying technology and rolling out platforms, making sure those platforms are scalable and elastic, so as agencies want to invest in technologies, they're able to do that and scale rapidly, rather than spending money on contracts where you're provisioning something where you're using only 10% of capacity.
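The "certify once, inherit" idea in Kundra's first point can be sketched as follows; the registry, function names, and agency/platform names are all hypothetical illustrations, not actual government systems.

```python
# Illustrative sketch: a central registry records a platform's certification
# once, and agencies reuse (inherit) it instead of each re-certifying.
CENTRAL_CERTIFICATIONS = {}  # platform name -> certifying agency

def certify(platform, certifying_agency):
    """Record a one-time, centrally held certification for a platform."""
    CENTRAL_CERTIFICATIONS[platform] = certifying_agency

def authorize(agency, platform):
    """An agency inherits an existing certification rather than redoing it."""
    if platform in CENTRAL_CERTIFICATIONS:
        certifier = CENTRAL_CERTIFICATIONS[platform]
        return f"{agency} inherits certification of {platform} from {certifier}"
    return f"{agency} must run its own certification of {platform}"

certify("Example IaaS", "GSA")
print(authorize("Interior", "Example IaaS"))
# prints Interior inherits certification of Example IaaS from GSA
print(authorize("Interior", "Uncertified SaaS"))
# prints Interior must run its own certification of Uncertified SaaS
```

The savings Kundra describes come from the second branch never being taken for platforms already in the central registry.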

InformationWeek: What about the internal cloud-like technologies? The Defense Information Systems Agency is building its own private cloud, for example. Does there need to be a centralized platform in government?

Kundra: The approach will be one that recognizes different requirements for various agencies. If the FAA needs to focus on a goal such as air traffic control across the country, that's very specific, versus if they're thinking about a platform for collaboration, which is a lowest common denominator. What is common across all different agencies?

Internally, we will be developing and launching a common platform so GSA becomes a center for providing these services, so you're not going around building platforms and replicating it in other data centers.

At the same time, we need to make sure we focus on open architecture. Open architecture is vital for information sharing, for collaboration down the line.


InformationWeek: Does GSA in some ways need to transform itself to a shared services organization, and will GSA then be managing data centers under that plan?

Kundra: That's part of that equation, and I'm looking forward to working with the new administrator to look at what GSA needs to do.

InformationWeek: Do you agree with criticism that all of IT in government has to be procured just as if the government was buying an aircraft carrier, and that's not the way we should be doing IT?

Kundra: If you look at the processes, if it takes 18 months to 2 years to go through procurement, you've already missed one revolution, one cycle. If we can abstract the procurement processes, security, architecture, provisioning and they're more focused on leveraging the service that's being provided rather than rolling out technology initiatives, that's where we want to move the federal government toward, and this is going to take time. It's going to take major changes when it comes to public policy and procurement and so forth.

InformationWeek: How long do you see this taking? Do you see this as a three- to five-year thing to get these processes down right, and are you running into resistance?

Kundra: My first month when I was looking for people to lead these transformational efforts, cloud computing had the most people in terms of the working group. CIOs would like nothing more than to be able to provision technology. Their view is that if we can take care of provisioning, they can provide services rapidly to their customers. The challenges we're going to face are around security, privacy, contracting and there are many other challenges as we move forward. We're working very closely with the procurement community, the privacy community and security community. It will take time, but we're going to begin by rolling out a set of services in the coming months and scale from there.

InformationWeek: When you say a set of services, what would you start with?

Kundra: That's forthcoming. The idea there is to start thinking of government as a platform.

InformationWeek has published an in-depth report on cloud storage. Download the report here (registration required).

General Services Administration's CIO Looks To The Cloud

By J. Nicholas Hoover / InformationWeek / June 12, 2009

Casey Coleman reveals the GSA's role in driving a government-wide cloud computing initiative and other IT priorities.

The U.S. General Services Administration finds itself in the midst of a number of major shifts in terms of delivering information technology, one of the foremost of which is cloud computing.

Meanwhile, the GSA also has some IT initiatives of its own underway, like energy efficiency and telework. InformationWeek recently chatted with GSA CIO Casey Coleman about GSA's role in driving a government-wide cloud computing initiative and about GSA's own IT priorities.

InformationWeek: What role can GSA play in helping the government accelerate its use of cloud computing and in helping to drive the government's cloud initiatives?

Coleman: GSA is a partner with industry providers to put them under contract with terms and conditions that can be tapped into by other agencies.

InformationWeek: Does that just mean driving schedule changes, does that mean helping those vendors put in bids for contracts, or does it mean something more?

Coleman: It certainly means those items you mentioned. One of the things that appears to be of interest is for GSA to take some of the friction out of the processes that agencies must follow and speed up their ability to obtain solutions from GSA. So, for example, let's say you have a Web site and you don't have ready-made capacity, so you need infrastructure as a service. You would typically have to prepare a statement of work and go through some kind of source selection, do an acquisition, make an award, conduct a FISMA [Federal Information Security Management Act of 2002] certification and accreditation, and continually monitor that environment, and you would have to replicate those functions every time more Web hosting is needed. If GSA were to provide infrastructure as a service and we pre-compete the vendors so that those vendors can compete only on price or on customer satisfaction for past service, and if we do this business certification and accreditation, then an agency in minutes or hours could have hosting capabilities available.

InformationWeek: Is there room for the GSA to be more of a shared services environment?

Coleman: We do some of that. We have shared services such as the HSPD-12 [Homeland Security Presidential Directive] smart card badges. Agencies don't have to re-do the technical assessments and so forth. I think that's an area where GSA could do more and it will depend on the goals of the administration.

InformationWeek: So there may, with that caveat, be a role there for cloud computing?

Coleman: I think that is an area where we could deliver. That's a leadership decision as to whether that's appropriate, but it's within the boundaries of what we can and have done in the past.

InformationWeek: What are you guys doing in terms of energy efficiency?

Coleman: What are we not doing? GSA's right in the middle of it. Two years ago, we consolidated all of our agency infrastructure into a program called GITGO, GSA IT Global Operations. We have turned off almost 500 servers that, through consolidation and virtualization, were no longer needed.

GSA has a very robust telework program. We have over 40% of our workers teleworking and we have issued over 9,000 new laptops to enable our employees to work remotely. Those machines are over 25% more efficient than the ones that we replaced.

We've committed to eco-friendly recycling through our provider, Intelligent Decisions. The machines we don't recycle we donate to schools, so we're trying to minimize the impact on landfills through recycling and reuse of equipment.


InformationWeek: One of the things that comes up repeatedly in your strategic plan is the existence of silos, stovepiped systems, and decoupled services. What are you doing to eliminate or consolidate some of those, and where are the biggest pain points that you'd like to see closed up?

Coleman: The next step to eliminating stovepipes is really identity and access management. Rather than each application and each system having its own user ID and password, we'll be using our managed smart cards for two-factor authentication so we can reduce the number of user IDs and passwords, improve security, improve our audit trail so we know when access was granted and when it was removed for those employees and contractors using our systems.
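The consolidation Coleman describes, many applications trusting one central authentication service instead of each keeping its own user IDs and passwords, can be sketched as follows; the class names and token scheme are illustrative assumptions, not GSA's actual implementation.

```python
# Illustrative sketch: applications validate a shared session token from one
# identity service (standing in for smart-card two-factor authentication)
# instead of keeping their own password stores.
import secrets

class IdentityService:
    """Central login point; the card_pin_ok flag stands in for the second factor."""
    def __init__(self):
        self.sessions = {}   # token -> user
        self.audit_log = []  # central record of who was granted access

    def login(self, user, card_pin_ok):
        if not card_pin_ok:  # second factor failed: no session issued
            return None
        token = secrets.token_hex(8)
        self.sessions[token] = user
        self.audit_log.append(("granted", user))
        return token

class App:
    """Any internal system: it trusts the identity service, holds no passwords."""
    def __init__(self, idp):
        self.idp = idp

    def access(self, token):
        return self.idp.sessions.get(token)

idp = IdentityService()
email, payroll = App(idp), App(idp)
token = idp.login("jdoe", card_pin_ok=True)
print(email.access(token), payroll.access(token))  # prints jdoe jdoe
```

Because every grant flows through one service, the audit trail Coleman mentions falls out of the design: there is a single log of when access was granted rather than one per application.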

InformationWeek: What are the longer-term things you'd like to do?

Coleman: Data center consolidation is one. I'd like to move toward voice over IP, toward unified communications and softphones to better support our telework initiative and provide more unified capabilities on the desktop.

InformationWeek: USA.gov runs on Terremark's cloud infrastructure. What drove the choice to run it there, rather than in a traditional facility?

Coleman: They had a lot of areas where they wanted to do more innovation, but their appropriation was at a certain level and there wasn't going to be a lot more money coming. It was time to renegotiate their hosting contract anyway. They put it up for re-competition, but wrote the statement of work to try to capture some of the developments in cloud computing and capture some of the cost efficiencies that industry now can offer. I believe they were able to save something like 80 to 90% of what previously they had been spending.

InformationWeek: What are you guys doing to be more transparent and open within OCIO [Office of the Chief Information Officer] and to the public?

Coleman: We publish a quarterly report on all of our service levels, our costs and the status of development initiatives, and I'd like to take that further and put that on our Intranet site so any employee can see when we're going to move to the next version of the e-mail client, for example. We've really stepped up our transparency internally, and I think that's as important as transparency to the public because of GSA's mission as a provider of business capabilities to the federal government.

That's not to say we don't have a public mission. We're participating in Recovery.gov and providing all the stimulus spending that we are doing through that mechanism, and all of our contract and procurement data is going to be through USASpending.gov. There's a lot of room for improvement in terms of making the data more useful, but we do try to make it available.

InformationWeek: What are you doing to accelerate your telework initiative?

Coleman: Longer term, we'd like to allow an open network where any device can be used on our network if properly authenticated, and that would be helpful in the event that power went out and people were told not to report to work but that wasn't foreseen and they didn't take laptops with them. We are deploying an enterprise version of Citrix (NSDQ: CTXS) to allow just that.

InformationWeek has published an in-depth report on private cloud computing. Download the report here (registration required).

GSA Backs Away From Federal Cloud CTO Appointment

By J. Nicholas Hoover / InformationWeek / June 9, 2009

Cloud computing is still a major initiative of federal CIO Vivek Kundra, and its importance was even outlined in an addendum to the president's 2010 budget last month.

Less than a month and a half after being named federal cloud CTO, Patrick Stingley has returned to his role as CTO of the Bureau of Land Management, with the General Services Administration saying the creation of the new role came too early.

"It just wasn't the right time to have any formalized roles and responsibilities because this is still kind of in the analysis stage," GSA CIO Casey Coleman said in an interview today. "Once it becomes an ongoing initiative, it might be a suitable time to look at roles such as a federal cloud CTO, but it's just a little premature."

Kundra introduced Google Apps to city employees in his former role as CTO of Washington, D.C., and has said that he believes cloud computing could be one way to cut the federal IT budget.

While Stingley is no longer the formal federal cloud CTO, he has by no means turned his attention away from cloud computing. As of last Thursday, he was still scheduled to give a presentation titled "Development Of A Federal Cloud Computing Infrastructure" at the Geospatial Service-Oriented Architecture Best Practices Workshop on Tuesday morning, though as CTO of the BLM, not as a representative of the GSA.

The GSA isn't by any means taking its foot off the accelerator with cloud computing. However, Coleman wants to make sure it's done in the right way. "As we formalize the cloud computing initiative, we will have a program office, we will have a governance model," she said.

Despite the elimination -- for now -- of the federal cloud CTO role, Coleman said that it's "fair to say" that the GSA will be taking a central role in pushing the Obama administration's cloud computing initiative, noting that the GSA should be a "center of gravity" for federal government IT.

The GSA will undoubtedly move to make procurement of cloud services an easier task than it typically is today, both by working with vendors and overhauling procurement processes a bit. Today, "if you have a Web site and you need to host it and don't have a ready-made data center with capacity, you would typically have to prepare a statement of work and go through some kind of source selection, do an acquisition, make an award, conduct a FISMA certification and accreditation, and continually monitor that environment, and basically you would have to replicate those functions every time more Web hosting is needed," Coleman said. "If GSA were to pre-compete the vendors who were capable of providing such a service, and if we do this business certification and accreditation at least at a broad level, then in minutes or hours you have hosting capabilities available to you."

The GSA is experimenting with cloud computing for its own internal use. For example, a federal information Web site is hosted via Terremark's Enterprise Cloud infrastructure-as-a-service product, which charges by capacity used. When it came time to renegotiate its old hosting contract, the GSA opened the contract to bidders and ended up saving between 80% and 90% with Terremark on a multiyear contract worth up to $135 million.

It's also possible that under the Obama administration, the GSA might begin playing more of a shared-services role in IT, as it does in building management. However, Coleman is coy about whether that's likely to happen, saying only that it would depend on the goals of the administration and the incoming GSA administrator. Stingley is reported to have been thinking about how the GSA might build out a federal cloud that agencies could easily tap into.

Attend a Webcast on virtualization and cloud computing. It happens June 16. Find out more and register.

US Federal Government to Offer Cloud Computing Services

Written by Marshall Kirkpatrick / July 29, 2009 9:55 AM / 5 Comments

The US Federal Government has plans to offer both Software as a Service for government agencies and a cloud-based platform for agencies to develop, test and deploy new applications. Those programs could be announced at the Gov 2.0 Summit in September, according to a report this morning from Federal News Radio.

SaaS offerings made available will be government-approved services like email, productivity apps, document management and business process management software. Those services are intended for use by other government agencies. Even more exciting may be the application platform that's part of the plan.

The initiative is reportedly based at the Office of Management and Budget.

The prospect of government agencies building and deploying Web-based applications on a government cloud platform, much as the private sector has used services like Amazon's cloud computing, is exciting. By lowering overhead and easing application management, a government cloud could yield a wave of application innovation across agencies. That's the theory, at least.

If government is to become the next hot application development sector, it will have to compete with a private tech sector that's already deep into this paradigm and offers developers the possibility of turning cheap web apps into huge riches through acquisition by larger firms.

Also worth watching will be any integration between the government's new cloud platform, data created by the apps deployed on it and the federal site, where an ostensible cross-section of public data is cataloged for subsequent use as development fodder. Creation of a mutually beneficial development ecosystem seems ambitious and promising, but could be far-fetched. Apps on a cloud, contributing data to the data storehouse, so that other developers could pull that data back onto the platform to create new apps and feed new data back into listings? Sounds too good to be true.

For a more in-depth look at the government's cloud agenda, see today's write-up by Jason Miller at Federal News Radio.

The Rise of the Government App Store

Wednesday, July 29, 2009

In a recent post to the CCIF mailing list, Bob Marcus outlined the coming opportunities and challenges facing what he described as "Government Cloud Storefronts". In the post he described the vision of Vivek Kundra (US Federal CIO) for the creation of a government Cloud Storefront. This Storefront, run by the GSA, will be launched September 9th and will make Cloud resources (IaaS, PaaS, SaaS) available to agencies within the US federal government (an $80+ billion a year IT organization).

What's also interesting is that the US isn't alone in its vision of centralized access points for procuring Cloud services and related applications. Several other governments, including the United Kingdom with its G-Cloud app store and Japan with its Kasumigaseki Cloud, are attempting to do the same, with Japan spending upwards of $250 million on its initiative.

Kundra, speaking at a recent conference on cloud computing at the National Defense University, elaborated on his GovApp Store concept: "Any agency can go online and agencies will be able to say 'I'm interested in buying a specific technology' and we will abstract all the complexities for agencies. They don't have to worry about Federal Information Security Management Act compliance. They don't have to worry about certification and accreditation. And they don't have to worry about how the services are being provisioned. Literally, you'll be able to go in as an agency… and provision those on a real-time basis and that is where the government really needs to move as we look at standardization. This will be the storefront that will be simple."

According to Marcus, "There are strong initial efficiency benefits (reduced procurement time and costs) gained by providing government projects with controlled access to multiple Cloud resources. However, unless a set of best practices is followed, there could be negative long-range results such as lack of portability and interoperability across Cloud deployments."

Ed Meagher, former deputy CIO at the Interior and Veterans Affairs departments, also sheds some light on the topic, saying, "The challenge will be working in both worlds and making those two worlds work together. There's going to be a lot of pressure on the [federal] CIO community to help this administration do the things it wants to do, like making government more efficient, more accessible to citizens and more transparent."

I could not agree more. But I also don't think the US Federal GovApp store requires standardization so much as transparency into the underlying processes that support the so-called "running" of the app store.

Some thoughts that come to mind: who exactly is building this app store, how will it be managed, what oversight will it have, and how can we prevent abuse (Halliburton-style contracts, anyone?) or even Apple's iPhone App Store-style "vendor lockout"? These are much more important questions that need to be addressed first.

To help address these issues, on September 21 the Network Centric Operations Industry Consortium (NCOIC) will host a free, open session on "Best Practices for Cloud Storefronts" at its Virginia Plenary. The focus will be on recommended minimal standardizations (and compliance tests) for Cloud resources included in a Storefront. Government IT leaders (e.g. the GSA) will be invited to participate in the session.

Labels: app store, Cloud Computing, Federal government of the United States, National Defense University, Network Centric Operations Industry Consortium, Vivek Kundra

U.K. Government to Create Country Wide Cloud Infrastructure

Interesting developments from our friends in the UK today. The British government's newly appointed chief information officer, John Suffolk, has been given new powers to sign off on all major IT projects, with a particular focus placed on the creation of a country-wide Cloud Computing infrastructure. Details of the new strategy were released as part of a Digital Britain report published earlier, which includes the development of "G-cloud", a government-wide cloud computing network.

The report highlights the development of a virtual Public Service Network (PSN) with "common design, standards, service level agreements, security and governance," and goes on to state that "all those government bodies likely to procure ICT services should look to do so on a scalable, cloud basis such that other public bodies can benefit from the new capability."

"The Digital Britain report recommends that the government take the necessary steps to secure that the Government CIO has a 'double lock' in terms of accountabilities and sign off for such projects. That will secure government-wide standards and systems." Needless to say, the need for cloud computing standards is now more important than ever, with the UK joining a growing list of countries either exploring or actively building government-sponsored clouds.

I should note I've been recently involved in similar discussions with the Canadian government regarding a national cloud strategy and platform. For anyone interested, I'm currently organizing a Canadian Federal Cloud Summit for this October in Ottawa. I will also be presenting the keynote at the upcoming Washington Cloud Standards summit on July 13th.

More details on the Digital Britain report are available here

Reuven Cohen CCIF Instigator

A look at Amazon's evolving government cloud strategy

By Eric Engleman on May 22, 2009 at 5:00 AM PDT

Amazon has targeted its cloud computing business at web startups, large companies, and scientists. But the Seattle online retailer has also been eyeing another potential customer for its cloud: government. The company is quietly building an operation in the Washington, D.C. area, and is aiming to become a key technology provider to federal and state governments and the U.S. military.

The Obama administration's growing interest in cloud computing, and the sheer size of government, make it a compelling market. The federal government market for cloud services is projected to grow to $800 million by 2013, and the state and local cloud market is expected to reach $635 million by that year, according to Input, a government contracting research firm based in Reston, Va.

Of course, Amazon faces competition for government cloud business from a variety of technology companies, including Google and IBM. Governments themselves are notoriously slow adopters of new technology, and have stringent security and regulatory requirements for their data, which may be a barrier to moving services into the cloud.

But Amazon is clearly positioning itself to work with governments. The online retailer has set up an initiative called "Amazon Government Solutions" that is targeting federal, state and U.S. Defense Department clients, according to job postings on the company's careers website.

Amazon has also hired two former federal government employees who have been turning up at various cloud computing conferences in the Washington, D.C. area. One is CJ Moses, whose LinkedIn profile describes him as a former FBI Assistant Section Chief. The other is Andrew Doane, whose LinkedIn profile indicates he was previously a "technical director" with the U.S. government.

Asked about Amazon's presence in Washington, D.C., company spokeswoman Kay Kinton said, "We do have employees in the D.C. area focused on a variety of efforts, with government agencies being just one of many."

Kinton declined to name any government customers, but said, "Governments have the same requirements as other organizations in the private sector - things like availability, security, scalability, and low cost - all of which we're very focused on."

Amazon did establish a relationship with one government client that may pay dividends in the future. The District of Columbia government lined up Amazon's cloud services to provide extra capacity for the city's websites during the Obama inauguration in January.

The city at the time made only modest use of Amazon's cloud, but the deal provided some useful exposure for Amazon. D.C.'s chief technology officer at the time, Vivek Kundra, went on to become the federal government's chief information officer under Obama, and has been an advocate for increasing use of cloud services by government agencies.

The White House recently released an analysis with its 2010 budget request that calls for a "fundamental reexamination of investments in technology infrastructure" and envisions various government cloud-computing pilot projects. The pilots will produce expected savings "many times the original investment" as agencies reduce their use of data centers, the document says.

Despite the growing buzz in government circles around cloud computing, however, some experts say it's still a long way to becoming the norm.

"There is a tremendous amount of excitement and interest and a lot of people are talking about it, but I don't believe we've hit the state where there's massive adoption," said Peter Mell, a senior computer scientist and cloud computing project leader at the U.S. Commerce Department's National Institute of Standards and Technology.

"It takes time for government to adopt the new technology and use it effectively," said Mell, whose agency studies ways to improve the country's technology infrastructure.

Federal government cloud adoption will triple by 2013, report says

Software-as-a-service major driver of cloud in government

By Jon Brodkin , Network World , 04/30/2009

Government agencies are moving slower on cloud computing than the IT industry as a whole, but federal government spending on the cloud will nearly triple by 2013, a new report says.

"Adoption has been slow; federal and state and local government organizations are gun-shy about migrating capabilities - especially mission-critical capabilities - to 'the cloud,'" market research firm INPUT states in Evolution of the Cloud: The Future of Cloud Computing in Government. "However, the convergence of tight budgets, aggressive market players, and increasing acceptance of the cloud computing model will fuel an uptick in demand for cloud computing."

Federal government use of cloud computing services added up to $277 million in 2008, and will increase steadily until reaching $792 million in 2013, the firm predicts. The 2013 figure will still represent less than 1% of total forecasted government IT spending of $87.8 billion.
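As a quick sanity check on these forecast figures, the implied annual growth rate and the cloud share of the 2013 IT budget can be computed directly (illustrative arithmetic only; the dollar amounts are those quoted above, and the growth-rate formula is a standard compound annual growth rate):

```python
# Sanity-check the INPUT forecast figures quoted in the article.
spend_2008 = 277e6      # federal cloud spending, 2008 (USD)
spend_2013 = 792e6      # forecast federal cloud spending, 2013 (USD)
total_it_2013 = 87.8e9  # forecast total federal IT spending, 2013 (USD)

# Compound annual growth rate over the five-year span
cagr = (spend_2013 / spend_2008) ** (1 / 5) - 1

# Cloud's share of total forecast federal IT spending in 2013
share = spend_2013 / total_it_2013

print(f"Implied CAGR 2008-2013: {cagr:.1%}")      # roughly 23% per year
print(f"2013 share of IT budget: {share:.2%}")    # under 1%, as the article says
```

So "nearly triple by 2013" works out to roughly 23% compound annual growth, and the $792 million figure is indeed under 1% of the $87.8 billion total.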

INPUT, which conducts market intelligence research for government contractors, is seeing a shift in government attitudes toward technology with the change from the Bush to Obama administrations.

New federal CIO Vivek Kundra is bullish on cloud computing technology, notes INPUT principal analyst Deniece Peterson. Bush's administration did not have a CIO or CTO for the overall federal government, although individual federal agencies did have executives filling those roles, she says.

"This administration seems to be much more technology-savvy than the last one," Peterson says.

The INPUT report divides cloud computing into three general areas: Web-based applications (software-as-a-service), storage and computing (infrastructure-as-a-service), and application development (platform-as-a-service).

Software-as-a-service is driving adoption of cloud computing within government agencies. For example, state and local government spending on software-as-a-service will grow from $170 million in 2008 to $635 million in 2013, according to INPUT.

It's already common for government to use cloud-based e-mail, payroll, Web conferencing and sales applications, Peterson says.

"Software-as-a-service is the driver of cloud computing in government right now, because it seems to be the easiest way to transition into it," she says.