Monthly Archives: May 2014
Posted on May 10, 2014 at 1:38 pm
Nasa’s cloud computing strategy came under fire from US authorities, with concerns raised about major security failings and a lack of communication and organisation.
The report from the US Office of Inspector General (OIG) stated that Nasa’s cloud services “failed to meet key IT security requirements”. It went on to say that of five Nasa contracts for acquiring cloud services, “none came close to meeting recommended best practices for ensuring data security.”
Nasa currently spends $1.5bn annually on IT services, only $10m of which is based in the cloud. However, the agency itself predicts that 75 percent of its future IT programmes will be in the cloud, making the findings of the Office of the Inspector General even more of a cause for concern.
The report went on, listing numerous problems with the way in which the agency failed to meet federal IT security requirements. “We found that the cloud service used to deliver internet content for more than 100 NASA internal and public-facing websites had been operating for more than two years without written authorisation or system security or contingency plans,” it said.
The audit also found that required annual tests of security controls had not been performed, which it said “could result in a serious disruption to Nasa operations”.
Nasa chief information officer Larry Sweet joined the agency in June and has a mountain to climb to reorder his department’s operations, with many decisions apparently made while his predecessor was completely in the dark. “Several Nasa Centers moved Agency systems and data into public clouds without the knowledge or consent of the Agency’s Office of the Chief Information Officer,” the report said.
The report noted that Sweet agreed with the findings and, subject to the availability of funds, will work “to improve Nasa’s IT governance and risk-management practices”.
Nasa has long been a supporter of cloud computing projects, lending its backing to the OpenStack open-source cloud project in 2010.
Posted in Cloud Hosting
Posted on May 8, 2014 at 5:32 pm
Security firm Sophos has launched a new service that allows users to run the company’s Unified Threat Management (UTM) platform on the Amazon Web Services (AWS) Elastic Compute Cloud (EC2) service.
The company said that it would be adding an hourly licence option for its threat management service, available for users to purchase on the AWS Marketplace.
Security services have been a feature of the AWS Marketplace since Amazon launched it in 2012, allowing third-party vendors to integrate their products with AWS virtual machine instances.
Sophos believes that the new pricing model will allow users to retain security on their servers when running AWS instances for short-term projects, or when relying on the cloud platform’s elasticity to scale with customer demand during peak operating times.
“As a long-standing security provider, we know about the many benefits that Amazon Web Services provides, especially to SMBs that have adopted the cloud,” said Sophos senior product manager Angelo Comazzetto.
“We pride ourselves on developing complete security offerings that are simple to use, and with this offering, companies can better defend their cloud security resources with layers of security provided by Sophos UTM.”
The company said that the hourly fees will depend on the pricing and region of the AWS instance, with listed prices ranging from $0.02 per hour for a Standard Micro instance to $3.10 for a High I/O 4XL EC2 instance.
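The arithmetic behind the hourly model is simple: the licence cost is the listed rate multiplied by the hours an instance runs, which is what makes it attractive for short-lived projects. Below is a minimal Python sketch using the two rates quoted above; the dictionary keys and the example workload are illustrative assumptions rather than Sophos product names.

```python
# Minimal sketch of hourly UTM licence costing. The two rates are the
# article's listed prices; instance labels and workload are assumptions.
HOURLY_UTM_RATES = {
    "standard_micro": 0.02,  # $0.02/hour for a Standard Micro instance
    "high_io_4xl": 3.10,     # $3.10/hour for a High I/O 4XL instance
}

def licence_cost(instance: str, hours: float) -> float:
    """Projected UTM licence cost for running `hours` on `instance`."""
    return HOURLY_UTM_RATES[instance] * hours

# e.g. a two-week burst project on a micro instance:
print(f"${licence_cost('standard_micro', 24 * 14):.2f}")  # $6.72
```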
Posted in Cloud Hosting
Posted on May 6, 2014 at 2:18 pm
Oracle has ended a legal spat with former partner CedarCrestone, in a case that went to court earlier this year.
The case began in September 2012, with Oracle alleging that the firm had stolen intellectual property relating to updates for Oracle’s tax and regulatory software.
CedarCrestone strongly denied these accusations and said Oracle had engaged in an “unlawful and systematic attack” against third-party support firms.
However, the dispute has now been settled. In a terse statement on Oracle’s website the firm states: “Oracle and CedarCrestone, Inc. announce that they have amicably resolved the litigation between them. The terms of the settlement are confidential.”
The case has echoes of a similar legal spat between Oracle and SAP regarding the theft of code by the German firm’s former partner TomorrowNow.
As well as ending the legal spat, Oracle has been busy updating its products, upgrading its Database 12c platform with wider management support through its Oracle Enterprise Manager platform.
The Database 12c service was announced at the start of July and offers a multi-tenant architecture within the cloud, which the firm said will be key for transitioning the platform to hosted services.
By adding the Enterprise Manager platform the firm said it could further support customers using the service by providing greater IT management.
This includes the ability to “consolidate, clone and manage many databases as one,” and improve IT productivity by reducing the time it takes to perform administrative tasks as well as providing the ability to identify and resolve issues with diagnostics and analysis capabilities.
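As a rough illustration of the “many as one” idea, the sketch below clones a pluggable database inside an Oracle 12c container database from Python. The credentials, host and PDB names are hypothetical, and it assumes a privileged account, Oracle Managed Files and a source PDB opened read-only; it sketches the multi-tenant cloning capability itself, not Oracle Enterprise Manager’s tooling.

```python
# Hypothetical sketch of 12c multi-tenant cloning: copy one pluggable
# database (PDB) inside a container database. Assumes Oracle Managed Files
# (otherwise a FILE_NAME_CONVERT clause is needed) and that the source PDB
# is open read-only, as 12c requires for cloning.
import cx_Oracle

# Hypothetical privileged connection to the container database.
conn = cx_Oracle.connect("c##admin", "secret", "dbhost/cdb1",
                         mode=cx_Oracle.SYSDBA)
cur = conn.cursor()

cur.execute("CREATE PLUGGABLE DATABASE hr_clone FROM hr_pdb")  # clone it
cur.execute("ALTER PLUGGABLE DATABASE hr_clone OPEN")          # open the copy
```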
Sushil Kumar, vice president of Product Strategy and Business Development at Oracle, said that adding the management tools would help customers better manage the platform as its use grows.
“As enterprises continue to implement private clouds, IT management is becoming increasingly complex and costly. Oracle Enterprise Manager 12c is being used by organisations around the world for its broad set of cloud-enabling and management capabilities,” he said.
“By extending Oracle Enterprise Manager 12c to enable managing ‘many as one’, with the new database release, Oracle is making it even easier for customers to significantly reduce IT management costs, avoid critical issues and outages and free up resources for other business-critical tasks.”
Posted in Cloud Hosting
Posted on May 4, 2014 at 5:32 pm
IBM and Pivotal have signed on to advance the Cloud Foundry platform.
The companies said that they would join the effort to build an open-source cloud platform which can be adopted by customers for public and private cloud computing platforms.
IBM said that the effort would allow customers to produce cloud computing deployments without the risk of vendor lock-in, keeping options open.
“Cloud Foundry’s potential to transform business is vast, and steps like the one taken today help open the ecosystem up for greater client innovation,” said IBM next generation platforms general manager Daniel Sabbah.
“IBM will incorporate Cloud Foundry into its open cloud architecture, and put its full support behind Cloud Foundry as an open and collaborative platform for cloud application development, as it has done historically for key technologies such as Linux and OpenStack.”
Launched in 2011 by VMware, Cloud Foundry seeks to provide businesses with a common platform for both public and private cloud networks.
For its part, IBM said that it would be providing its WebSphere platform to Cloud Foundry, including a preview version of the Application Server Liberty Core.
“We believe that the Cloud Foundry platform has the potential to become an extraordinary asset that many players can leverage in an open way to enable a new generation of applications for the cloud,” said Pivotal chief executive Paul Maritz.
“IBM’s considerable investment in Cloud Foundry is already producing great results with application-centric cloud offerings such as making IBM WebSphere Liberty available on Cloud Foundry. We look forward to growing and expanding an open Cloud Foundry community together with IBM.”
Posted in Cloud Hosting
Posted on May 2, 2014 at 4:52 pm
SAN FRANCISCO: Intel has outlined its vision to reshape the data centre, with new approaches to compute, storage and network technologies intended to make data centres more flexible and cost-effective, measures that will be needed to meet looming challenges in data volumes and power consumption.
At its data centre event in San Francisco, Intel outlined its strategy, which amounts to creating a kind of reference architecture for data centre operators to follow. It comprises technologies for virtualising the network, making storage smarter, and re-architecting servers at the rack level to deliver a pool of resources that can better meet the requirements of applications.
These changes are needed in order to meet the changing requirements of data centres, driven by factors such as the boom in mobile devices and the success of services such as social media, according to Intel’s senior vice president of the data centre and connected systems group, Diane Bryant.
“If you look at where we are now, today’s infrastructure is strained. It can take weeks to reconfigure the network to support new processes. At the same time, we’ve moved from the traditional structured enterprise data to a world of unstructured data,” she said.
Intel’s solution is to create a blueprint for the software-defined data centre, using automation to enable it to adapt to changing requirements.
Perhaps the most radical part of the vision is Intel’s Rack Scale Architecture (RSA) strategy, which “breaks down the artificial boundary of the server” in order to turn racks into pools of compute, storage and memory that can be used to provide an application with the optimum resources it requires, Bryant said.
Jason Waxman, general manager of Intel’s Cloud Infrastructure group, showed off two server “tray” designs that are a step on the road to delivering this vision, he claimed. One was filled with multiple Atom nodes, with a network mezzanine card at the rear that provides a switched fabric right in the tray, with silicon photonics interconnects to link each tray to a top-of-rack switch.
“Ideally, you want the rack to be completely modular, so you can upgrade each of the subsystems as you require, without having to rip out the whole server,” he said.
The other parts of the data centre blueprint involve virtualising the network, using software-defined networking (SDN) and network function virtualisation (NFV), the latter of which sees network functions such as a firewall or VPN delivered using virtual appliances running on standard servers.
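To make the NFV idea concrete, the fragment below sketches the simplest possible network function, a firewall rule check, as ordinary software of the kind that could run in a virtual appliance on a standard server. The rule format is invented for this example and does not correspond to any vendor’s product.

```python
# Toy "virtual firewall": a network function implemented in plain software.
# Rule format and addresses are invented purely for illustration.
from ipaddress import ip_address, ip_network

RULES = [
    ("allow", ip_network("10.0.0.0/8"), 443),  # internal HTTPS traffic
    ("deny", ip_network("0.0.0.0/0"), None),   # default deny everything else
]

def filter_packet(src: str, dport: int) -> str:
    """Return the action of the first rule matching the packet."""
    for action, net, port in RULES:
        if ip_address(src) in net and port in (None, dport):
            return action
    return "deny"

print(filter_packet("10.1.2.3", 443))     # allow
print(filter_packet("198.51.100.7", 22))  # deny
```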
On the storage side, Intel sees a growing role for SSD storage, perhaps integrated into the rack, while less frequently used data is relegated to low-cost disk storage in Atom-based storage nodes.
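A toy version of that hot/cold placement decision might look like the sketch below; the one-week threshold and tier names are assumptions chosen purely for illustration.

```python
# Illustrative hot/cold tiering: recently read objects stay on SSD, rarely
# touched objects are relegated to cheap disk. Threshold is an assumption.
import time

COLD_AFTER = 7 * 24 * 3600  # demote after a week without a read (assumed)
last_access = {}            # object name -> timestamp of last read

def record_read(obj):
    last_access[obj] = time.time()

def tier_for(obj):
    """Place recently read objects on 'ssd', everything else on 'disk'."""
    ts = last_access.get(obj)
    return "ssd" if ts and time.time() - ts < COLD_AFTER else "disk"
```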
Intel stressed that its approach was standards-based, saying that the orchestration and management tools to deliver the software-defined network vision would be delivered by third parties, such as the OpenStack cloud framework.
However, Intel pushed home the advantages of its x86 architecture chips, pointing to the vast ecosystem of operating systems, applications and services that has built up around them.
“Software consistency is important,” said Waxman. “With other architectures, it’s not just about porting apps, it’s about the supporting database and the middleware,” he added.
Posted in Cloud Hosting