Cloud Hosting

Rackspace launches Open Cloud Academy training facility

Posted on May 17, 2013 at 11:18 am

Rackspace is opening the doors on a new training facility it hopes will address a shortfall in qualified administrators for cloud computing platforms.

The company said that its Open Cloud Academy would allow IT professionals and students to obtain certification in the management and administration of cloud deployments.

Based in San Antonio, Texas, the training facility will target current professionals as well as students, teaching them cloud management techniques. The facility will also house veterans’ training programmes to give returning soldiers job skills in IT and cloud administration.

Rackspace hopes that the new campus will help to bridge a gap which exists between cloud vendors and the current curriculum taught in universities. With qualified administrators in short supply, the company believes that many firms are unable to properly deploy cloud computing initiatives.

“As cloud adoption increases, companies stand to reduce IT costs and become more agile, however, with not enough people properly trained in these cutting edge technologies, organisations are missing the boat,” said Rackspace chairman Graham Weston.

“The Open Cloud Academy can help turn the tide by offering highly sought after technical training to the public, bolstering this scarce talent pipeline and helping fill the countless roles in San Antonio and beyond.”

Rackspace is not the only vendor to see a widening gap between emerging platforms and the training programmes offered to students.

Earlier this week government officials warned that in the UK alone some 40,000 more graduates will be needed in order to overcome a skills shortfall in the science and technology sectors.

Vendors have increasingly looked to step in to offer training programmes in areas where universities have fallen short.

EMC has begun offering training programmes and certifications for big data analysts, while IBM has reached out to university computer science departments around the globe to get students versed on its Watson big data platform.

Posted in Cloud Hosting

Google Drive suffers outage

Posted on May 15, 2013 at 2:40 pm

Google’s Drive cloud storage service was hit by a two-hour outage.

The outage struck in the early morning, and comes on the heels of a similar incident at its Microsoft cloud storage counterpart earlier this month.

“We apologize for the inconvenience and thank you for your patience and continued support,” wrote Google on its Drive status page.

“Please rest assured that system reliability is a top priority at Google, and we are making continuous improvements to make our systems better.”

Google announced that it was investigating a potential outage at 7:17 AM PST. By around 8 AM PST, the firm reported that the issue was being resolved and the service should be back online within the hour.

V3 has contacted Google for clarification on what caused the outage and will update this story when details are released.

Google Drive is the search giant’s cloud storage service. The product was recently updated to let users preview 30 different types of files without opening them directly.

Drive is in direct competition with Microsoft’s cloud storage service SkyDrive. Redmond’s service also was recently hit with an outage. Earlier this month, SkyDrive went down because of an overheating datacentre.

Twitter and Amazon were also taken offline by outages in the early months of 2013. Amazon’s US e-commerce homepage was down for about an hour. Twitter’s outage lasted about three hours. Both outages occurred on the same day in late January.

Posted in Cloud Hosting

Dropbox branches out with Mailbox buy

Posted on May 13, 2013 at 12:05 pm

Online storage specialist Dropbox has announced a deal to acquire email management firm Mailbox.

The company said that the deal, terms of which were not disclosed, will expand the scope of its business to include the popular inbox management tool. The companies said that under the deal all 13 of Mailbox’s employees will remain with the company and the developer will continue to support and develop its flagship mobile application.

In announcing the deal, Mailbox said that it would look to take advantage of Dropbox’s resources to grow its team and support a service which has already generated an extensive wait list for new users.

“Rather than grow Mailbox on our own, we’ve decided to join forces with Dropbox and build it out together. To be clear, Mailbox is not going away,” the company said.

“The product needs to grow fast, and we believe that joining Dropbox is the best way to make that happen.”

For Dropbox, the move gives the company an infusion of talent and an expansion to its current online storage and file-sharing service. The company said that the Mailbox team was a natural fit to its structure.

“After spending time with Gentry, Scott, and the team, it became clear that their calling was the same as ours at Dropbox—to solve life’s hidden problems and reimagine the things we do every day,” the company explained.

“We all quickly realized that together we could save millions of people a lot of pain.”

Posted in Cloud Hosting

Microsoft Outlook and Hotmail outages caused by overheating datacentre

Posted on May 11, 2013 at 7:04 pm

Microsoft has said that outages on its Outlook and Hotmail services were caused by an overheating datacentre that took itself offline to stop problems spreading further.

The incident occurred on Wednesday and left scores of customers without access to their email accounts.

In a blog post written by vice president of test and service engineering, Arthur de Haan, the firm gave more insight into the cause of the issue.

“On 12 March, in one physical region of one of our datacenters, we performed our regular process of updating the firmware on a core part of our physical plant. This is an update that had been done successfully previously, but failed in this specific instance in an unexpected way,” he said.

“This failure resulted in a rapid and substantial temperature spike in the datacenter. This spike was significant enough that it caused our safeguards to come in to place for a large number of servers in this part of the datacenter.”

He explained this meant access to mailboxes housed on the servers was suspended and this also stopped other pieces of infrastructure acting as a failover to allow access.

“Once the safeguards kicked in on these systems, the team was instantly alerted and they immediately began to get to work to restore access,” de Haan continued.

“Based on the failure scenario, there was a mix of infrastructure software and human intervention that was needed to bring the core infrastructure back online.

“Requiring this kind of human intervention is not the norm for our services and added significant time to the restoration.”

De Haan apologised for the outage and said he hoped the explanation would give customers some understanding of the situation that faced the firm.

The incident underlines that cloud computing outages continue to hit the headlines, and may leave some businesses wary of using the model to host business-critical applications.

Posted in Cloud Hosting

VMware details network virtualisation and hybrid cloud capabilities

Posted on May 9, 2013 at 12:37 pm

VMware is set to overhaul the way its cloud computing platform handles networking by combining its existing technology with that of Nicira to create a new software-defined networking (SDN) platform.

The firm has also created a new Hybrid Cloud Services business unit and is set to introduce vCloud Hybrid Service, part of a plan to enable VMware customers to seamlessly expand workloads from their datacentre out to a public cloud provider.

Both moves were detailed at a strategic forum, where chief executive Pat Gelsinger outlined VMware’s corporate strategy to an audience of investors.

On the networking side, VMware is aiming to completely virtualise the underlying network by combining its existing vCloud Networking and Security product with the network virtualisation technology it gained from the acquisition of Nicira last year.

The resulting VMware NSX, coming sometime in the second half of 2013, will enable VMware customers to completely abstract network services and connections away from the underlying physical infrastructure, enabling these to change as virtual machines are created or moved around the datacentre.

“The main problem in most datacentre deployments is that we can virtualise servers, so you can spin up a new instance in minutes, but the reality is that [customers] have to deal with their networking team to get it correctly configured in the right place in the network,” VMware’s EMEA chief technologist Joe Baguley told V3.

With many network services such as firewalls already implemented as virtual machines and most traffic in VMware private clouds already handled by virtual switches, NSX will complete the picture, according to Baguley.

“A lot of traffic in datacentres now hardly ever touches the physical infrastructure and is between servers on one host, so physical switches need only connect the physical host machines together and we do everything on top of that,” he said.

Consequently, datacentre infrastructure could almost be implemented as one giant flat network, with all the topology and security management implemented by NSX, according to VMware.
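The decoupling Baguley describes can be sketched in miniature: the logical topology lives entirely in software, so moving a virtual machine between hosts changes a placement record but not the network the VM sees. The sketch below is purely illustrative — the class and method names are invented and bear no relation to VMware’s actual NSX APIs.

```python
# Toy model of network virtualisation: reachability is defined by logical
# segments held in software, while physical placement is a separate record.
# All names are illustrative; this is not VMware's NSX.

class VirtualNetwork:
    def __init__(self):
        self.segments = {}   # logical segment name -> set of VM names
        self.placement = {}  # VM name -> physical host

    def attach(self, vm, segment, host):
        """Connect a VM to a logical segment and record where it runs."""
        self.segments.setdefault(segment, set()).add(vm)
        self.placement[vm] = host

    def migrate(self, vm, new_host):
        # Only the placement record changes; the logical segment, and with
        # it addressing and reachability, follows the VM automatically.
        self.placement[vm] = new_host

    def can_reach(self, vm_a, vm_b):
        # Reachability depends purely on shared logical segments,
        # never on which physical host either VM happens to occupy.
        return any(vm_a in vms and vm_b in vms
                   for vms in self.segments.values())

net = VirtualNetwork()
net.attach("web1", "dmz", host="rack1-host3")
net.attach("web2", "dmz", host="rack2-host7")
net.migrate("web1", "rack4-host1")  # reachability is unaffected by the move
```

The point of the sketch is the one VMware is making: the physical fabric only has to connect hosts together, because everything above that is a software mapping.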

Meanwhile, the Hybrid Cloud Services business unit will work with VMware’s service provider partners to enable customers to take advantage of public cloud services using the same management and orchestration tools as with their private cloud infrastructure.

In practice, this is likely to involve public clouds built around VMware’s vCloud suite and other tools that enterprise customers are already using internally, although VMware has yet to fully detail its plans.

Gartner analyst Chris Wolf warned that the move is essentially a strategy by VMware and its partners to restrict user choice over which public cloud services customers can “burst” workloads to in order to meet peaks in demand for resources.

“Choice will mean a VMware-hosted offering that in theory will make it easy for customers to move VMware-based workloads in and out of the public cloud,” he said in a blog post.

The challenge in making such a move is in ensuring that the workload runs and is managed properly in the new environment, which may be based on an entirely different cloud stack to that used internally.

“This is an opportunity where VMware can leverage its management assets both inside the datacentre and in the public cloud to allow customers to redeploy workloads and not have to worry about the infrastructure or management stack,” Wolf explained.

In other words, VMware and its partners are trying to make it more convenient for customers to choose them rather than use Amazon Web Services (AWS), currently the largest provider of public cloud services.

VMware said its vCloud Hybrid Service will be available later this year via its existing partner ecosystem.

The Hybrid Cloud Services business unit itself is being overseen by Bill Fathers, former president at cloud services and hosting specialist Savvis.

Posted in Cloud Hosting

Top 10 most read: Snooping security cameras, Samsung S4 eye scroll and TI to work on ARM systems

Posted on May 7, 2013 at 8:49 pm

The biggest news of the week for V3 readers was the announcement that Texas Instruments (TI) has teamed up with HP to develop ARM-based chips for low-power servers.

The chips will be the first in the industry to use the ARM Cortex-A15 architecture, and will offer power-efficiency due to their integrated cores for network processing and I/O for servers. TI will no doubt be hoping it gets a boost from the deal, as the firm aims to turn itself around after recently laying off 1,700 staff.

Another story that proved of huge interest to readers was the news that they might be spied on via their IP security cameras. Security researcher Adrian Hayter revealed that he was still able to tap into hundreds of publicly accessible IP camera feeds via a simple spot of Googling and a bit of knowledge about what to look for, a year after the flaw was first detailed. Sony, Panasonic and Trendnet were among those vendors listed as failing to adequately restrict access to camera feeds.

Also on the security front, basic computer Raspberry Pi became the latest victim of a DDoS attack, while Microsoft will release four critical patches on Tuesday, with the most concerning hole relating to remote code execution in Windows 8 and Internet Explorer 10.

Readers were also keen to find out about the government’s new plans for a cloud-first policy, which will see central government departments have to use public cloud services wherever possible.

Texas Instruments climbs aboard HP’s Project Moonshot for low power servers
Chip maker to collaborate on design of ARM-powered systems

Security cameras continue to pose snooping risk
Thousands of camera feeds are publicly available, despite security warnings

UK commits £88m to build world’s largest optical telescope
E-ELT expected to capture images of the first galaxies formed

Researchers unearth ‘time bomb’ in Chinese APT
Malware attempts to put researchers off the trail of its command and control servers

Raspberry Pi hit by DDoS attack
Organisation falls victim to online attack from botnet with a million nodes

Microsoft to add four critical fixes to patch deluge

UK government to adopt ‘cloud first’ policy to cut IT spending
Central government departments will be mandated to use public cloud services wherever possible

Samsung Galaxy S4 Eye Scroll frenzy proves value of Apple release strategy
Korean firm right to hold back at CES and MWC

Dropbox chief slams Apple’s iCloud
Walled garden is not so pretty, says cloud chief exec

Windows 8 fails to spark business interest, as iPad rules corporate roost
V3 readers decry “awful” user interface, although one fifth have bought new systems

Posted in Cloud Hosting

Apple and Amazon talk up ‘used’ content services

Posted on May 5, 2013 at 6:54 pm

Apple and Amazon have filed patent applications which suggest that the firms are planning to offer secondhand marketplaces for digital content.

According to the US filings, each company is seeking to patent technologies which, if granted, would cover a marketplace where users can transfer their rights to digital content to another party or device.

In the filings, both companies describe how users could agree to transfer the digital rights management (DRM) protections on a piece of digital content to another user. Apple’s filing details how the transfer could take place, while Amazon’s describes the structure of a web-based store for digital content.

The described stores would turn the DRM holdings into a transferable commodity, allowing users to buy and sell “secondhand” content by acquiring the DRM from peers in the way users can buy and sell books and recordings in physical marketplaces.
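Neither filing discloses implementation details, but the core mechanism both describe — a licence as a transferable record, revoked from the seller in the same step it is granted to the buyer, so that only one valid copy of the right exists at a time — can be modelled roughly as follows. All class and method names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class License:
    """A DRM entitlement for one piece of content, held by one user."""
    content_id: str
    owner: str

class UsedContentStore:
    """Toy model of the marketplace described in the filings: a sale moves
    a licence atomically from seller to buyer rather than copying it."""

    def __init__(self):
        self.licenses = []

    def issue(self, content_id, owner):
        """Record a first-hand sale, creating a fresh licence."""
        lic = License(content_id, owner)
        self.licenses.append(lic)
        return lic

    def transfer(self, lic, buyer):
        # The key constraint: the seller's right is revoked in the same
        # step that the buyer's right is granted, so the content is never
        # usable by both parties at once.
        if lic not in self.licenses:
            raise ValueError("licence no longer valid")
        lic.owner = buyer
        return lic

store = UsedContentStore()
lic = store.issue("album-42", "alice")
store.transfer(lic, "bob")  # alice can no longer use the content
```

In a real service the transfer step would also have to move or re-encrypt the content itself between devices, which is the part Apple’s filing dwells on.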

Should the patents be granted and implemented, both companies could add used content services onto their respective online media stores.

Amazon offers both digital book and music content through its online markets, while Apple allows users to purchase digital books through the iBooks store and music and video content through the iTunes store service.

The launch of the services could also trigger a fresh round of legal turmoil for the companies should publishers and studios take exception to a model which could partially or fully cut them out of the revenue loop.

Apple is already in hot water with the US government over its iBooks service. Antitrust officials have argued that the company’s revenue model, which allows publishers to set their own prices, unfairly forces other retailers to adjust their own prices based on publisher demands.

Posted in Cloud Hosting

‘Used’ media markets could hinge on publisher reactions

Posted on May 3, 2013 at 9:13 am

A recently uncovered pair of patent filings from two of the industry’s largest music and video retailers could trigger a seismic shift in the way we buy and sell music online.

Both Apple and Amazon have filed for patents on secondhand digital media services. In essence, the stores would allow users to buy and sell the DRM rights for content, transferring those rights to another device and removing them from their own.

The benefit for users is obvious: customers could save money by buying used copies and recoup some of their outlay by reselling. The stores, meanwhile, are able to collect a higher margin on transactions of used items (though that could change with the revenue models Apple and Amazon use for their stores).

The publishers and studios, however, are less fond of used sales, for obvious reasons: having already sold the media once, they get no cut of resales and lose money when customers forgo new content for used copies.

As such, studios, labels and publishers have come up with a variety of ways to keep users buying new content. Whether through new formats, special-edition releases of popular titles or other methods, they have sought to minimise the value of used content.

Any university student who has had to purchase textbooks knows these tricks. Publishers put out new editions of textbooks every year in order to thwart the market for used books. Often these new editions don’t add much content, only enough to change page numbers and layouts so that course syllabuses and lesson plans shift to the point where professors and bookstores have no choice but to abandon the old editions available used.

Will there be a similar reaction should Amazon and Apple succeed in opening their “used” digital content services? Will the publishers seek to strike a royalty model which gives them another cut of the resale profits, or will they simply take to the courts to block these services from ever seeing the light of day?


Posted in Cloud Hosting

Rackspace updates Private Cloud Software with orchestration tools

Posted on May 1, 2013 at 4:50 pm

Rackspace is continuing its cloud computing momentum this year with an update to its Private Cloud Software, adding a combined user interface and orchestration to the free-to-download platform.

The Rackspace Private Cloud Software, first launched last year, is a version of the OpenStack cloud framework, packaged up by Rackspace and offered as an open-source solution for organisations to get started with a private cloud, hosted in their own datacentre, by Rackspace or another provider.

This new version updates the OpenStack components to those in the most current “Folsom” release, and also broadens the operating system support, allowing users to choose Ubuntu, Red Hat Enterprise Linux or CentOS as the host platform.

However, the major new feature is OpenCenter, a tool that provides a graphical user interface for provisioning of the cloud infrastructure plus deployment and management of cloud services to run on it.

“Rackspace Private Cloud’s powerful new OpenCenter platform is a user interface and orchestration tool built to deploy, operate and scale on-premise private clouds,” said Jim Curry, Rackspace’s general manager of the Private Cloud business.

OpenCenter is also open-sourced under the Apache 2.0 licence, and provides administrators with the ability to deploy the OpenStack controller and compute services across the infrastructure, as well as delivering high availability support.

Although the Private Cloud Software is free to download, Rackspace makes money by providing technical support and services to customers.

The move follows a busy start to 2013 for Rackspace, which has already announced plans to double its datacentre capacity in the UK and acquired cloud services firm ObjectRocket for its MongoDB NoSQL cloud database technology.

Rackspace originally developed its OpenStack technology in collaboration with Nasa, with which it co-founded the open-source project in 2010.

Posted in Cloud Hosting

Cloud computing makes 16GB smartphones the ideal size

Posted on April 29, 2013 at 12:52 pm

According to analysts, the average low-end smartphone now has more than enough storage for most users.

A study from IHS found that on average, customers who purchased a smartphone in 2012 only needed about 12.8GB of storage to meet their needs.

The report suggests that users are now making such heavy use of cloud-based services that local storage has become something of an afterthought. As a result, we have less data to store on our phones and the concept of regular storage increases is less of a selling point than it once was.

If it holds up, the phenomenon could change the way we look at mobile device features. With storage a secondary concern, vendors could save money on storage costs and invest in other hardware features, such as improved batteries and more efficient antenna components.

As a result, traditional bottlenecks could be alleviated and the overall handset experience could improve.

On the other hand, there is a very definite loser here. Chipmakers stand to take a huge hit if their largest market decides to flatten out its demand for NAND storage chips, potentially putting a major crunch on the market.

Meanwhile, cloud vendors face a pressing concern of their own as more data is being uploaded to the cloud, creating the need for more storage, security and management tools.

As we so often see, advances in the market are far more complex than they first seem and every new technology has an impact on other parts of the ecosystem.


Posted in Cloud Hosting
