Posted on January 12, 2014 at 5:30 pm
SAN JOSE: Dell has branded the public cloud a “race to the bottom”, telling V3 that it pulled the plug on its public cloud offering because it did not want to take on the “800lb gorillas” currently in the market.
Dell’s decision to exit the public cloud market came as a shock, as rival vendors are pushing hard to get a piece of a market currently dominated by Amazon Web Services. However, Sam Greenblatt, chief architect of Dell’s Enterprise Solutions Group, told V3 that the firm was not willing to put up the resources to fight the current industry incumbents.
Greenblatt said: “There are several 800lb gorillas in the public cloud market and we are not one of them. Building that market, you really have to understand in detail how you are going to be the low-cost provider and at the same time deliver the best value. Some people call it a race to the bottom in the public cloud.”
According to Greenblatt, Dell’s internal research found more than 90 percent of its customers were interested in private clouds. “Of the 90 percent, 75 percent is actually doing virtualisation, not cloud. People say cloud, but they are usually managed hosting providers, they are not really doing the cloud market. So we decided we were not going to invest heavily in public and we were going to let the 800lb gorillas slug it out in that market, but we are not giving up on the private cloud business, or OpenStack.”
Greenblatt was keen to state that the firm was fully behind the OpenStack project, but did have some sobering words about the hype surrounding the cloud. “It’s a transport mechanism and a computing architecture, it’s not something magical that you sprinkle fairy dust on and all your problems go away,” he said.
Greenblatt also said Dell’s decision to get out of the public cloud market had meant the firm let a number of contractors go, though he would not give exact numbers, saying only that Dell had a lot of contractors, which suggests the move was at least partly a cost-cutting measure.
Given Dell’s internal research, perhaps it is not surprising that the firm didn’t want to take on the likes of Amazon, Google and Microsoft in the public cloud market. Although Greenblatt didn’t refer to those companies by name, and said margins were not the primary reason behind the decision to abandon the public cloud, it is becoming clear that new vendors will find it increasingly hard to mount a challenge to the public cloud incumbents.
Posted in Cloud Hosting
Posted on January 10, 2014 at 5:20 pm
Windows Server 2012 R2 is a comprehensive refresh of Microsoft’s server platform, with advances across the board in storage, networking and Hyper-V, according to the firm.
However, some features stand out as “game changing”, according to Jeff Woolsey, principal programme manager for Windows Server Virtualisation. These include storage tiering in software, and a multi-tenant gateway to support software defined networking (SDN) in cloud deployments.
Storage tiering is an update to the Storage Spaces feature of Windows Server 2012. It creates a pool of storage from a bunch of disks directly attached to the server, with thin provisioning and resiliency provided by the file system.
In the upcoming R2 release, customers can tier that storage using a combination of SSDs and spinning disks, delivering a dramatic boost in I/O performance.
“We’re taking mainstream SSDs, applying them to hard disks, and giving you phenomenal performance,” Woolsey told V3.
In a demo at the TechEd conference, Woolsey showed how a server with just spinning disks achieved 7,400 input/output operations per second (IOPS). The same task with four SSDs added for tiering delivered 124,000 IOPS, roughly a 16x performance improvement.
“Now you can set up a scale-out file server with JBOD storage and JBOD SSD, and deliver the same performance, resilience and fault-tolerance as a SAN at a fraction of the cost,” Woolsey said.
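For admins who want to try it, the tiering setup can be driven from PowerShell. What follows is a minimal sketch, assuming a Windows Server 2012 R2 box with poolable SSDs and hard disks already attached; the pool, tier and virtual disk names, and the tier sizes, are illustrative rather than taken from Woolsey’s demo.

# Pool every disk that is eligible for Storage Spaces
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "TieredPool" -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks

# Define an SSD tier and a hard disk tier within the pool
$ssd = New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "SSDTier" -MediaType SSD
$hdd = New-StorageTier -StoragePoolFriendlyName "TieredPool" -FriendlyName "HDDTier" -MediaType HDD

# Create a mirrored virtual disk spanning both tiers; frequently accessed data is promoted to the SSD tier
New-VirtualDisk -StoragePoolFriendlyName "TieredPool" -FriendlyName "TieredSpace" -StorageTiers $ssd, $hdd -StorageTierSizes 100GB, 900GB -ResiliencySettingName Mirror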
R2 also supports deduplication for active virtual machines, which will enable customers to slash the costs of storage to support virtual desktop infrastructure (VDI) deployments.
“This has been one of the blockers to VDI – when customers actually see the cost of the storage to implement it, it just doesn’t make business sense,” said Woolsey.
While the dedupe is processed in software, it does not significantly affect performance, he said, as “servers are never compute bound, as most of the time they are waiting around for I/O and storage.”
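Deduplication for running virtual machines is switched on per volume. Below is a minimal sketch, assuming the data deduplication feature is being added to a 2012 R2 file server and that E: holds the VDI virtual hard disks; the drive letter is illustrative.

# Install the deduplication feature, then enable it on the VDI volume using the HyperV usage type new in R2
Install-WindowsFeature -Name FS-Data-Deduplication
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Optionally run an optimisation pass immediately rather than waiting for the background schedule
Start-DedupJob -Volume "E:" -Type Optimization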
Meanwhile, the multi-tenant gateway extends the network virtualisation features introduced in Windows Server 2012 to allow service providers to better support multiple customers in their cloud infrastructure.
“Customers want to be able to bring their network to that cloud, and to do that you need a gateway. Today, there are some hardware gateways, but you have to buy the right one, and so we just provide that in software under R2,” Woolsey said.
System Center is the control plane used to create and manage network virtualisation, while the data plane lives in Windows Server, he explained. The R2 release also extends Microsoft’s PowerShell automation framework, turning it into a “fundamental building block for operating the cloud,” according to Woolsey.
“If you are an IT pro, you have to have PowerShell on your resume today. You have to,” he said.
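In practice that means jobs which once needed the GUI can be scripted end to end. The snippet below is a minimal sketch of the kind of workflow Woolsey is alluding to, using the standard Hyper-V cmdlets; the VM name, virtual switch, paths and destination host are all illustrative.

# Create and start a virtual machine entirely from PowerShell
New-VM -Name "web01" -MemoryStartupBytes 2GB -SwitchName "TenantSwitch" -NewVHDPath "D:\VMs\web01.vhdx" -NewVHDSizeBytes 60GB
Start-VM -Name "web01"

# Live-migrate it to another host, using the compression option added in R2 (migration must be enabled on both hosts)
Enable-VMMigration
Set-VMHost -VirtualMachineMigrationPerformanceOption Compression
Move-VM -Name "web01" -DestinationHost "hyperv02"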
Windows Server 2012 R2 will be available as a preview release later this month and is set to ship commercially later this year.
Posted in Cloud Hosting
Posted on January 8, 2014 at 1:31 pm
Salesforce has agreed to acquire cloud marketing firm ExactTarget for $2.5bn, a 53 percent premium on the firm’s pre-deal trading price.
Salesforce intends to use the acquisition to bolster its digital marketing capabilities, as the cloud pioneer recognises the burgeoning IT spending clout that is being wielded outside the traditional IT department.
Marc Benioff, chief executive at Salesforce.com, said: “The [chief marketing officer] is expected to spend more on technology than the CIO by 2017. The addition of ExactTarget makes Salesforce the starting place for every company and puts Salesforce.com in the pole position to capture this opportunity.”
Salesforce has already invested heavily in its marketing offerings, paying $276m for social media monitoring firm Radian6 in 2011 and $689m for Buddy Media in 2012. But Salesforce clearly felt it had further work to do in strengthening its arsenal, given those previous deals and the premium it was willing to pay for ExactTarget.
ExactTarget’s 6,000-plus client list, which includes the likes of Coca-Cola, Nike and Gap, may have added to its allure, wrote Angela Eager, an analyst with TechMarketView, in a company blog post.
“Salesforce.com is putting a lot of faith as well as a lot of money into ExactTarget. It will substantially expand Salesforce’s Marketing Cloud, which has good coverage of social channels thanks to previous Buddy Media and Radian6 acquisitions,” she said. “Salesforce.com has momentum but as this deal indicates it is having to invest more extensively to maintain or increase the pace.”
The deal, which is expected to close by 31 July, comes hot on the heels of IBM splashing out $2bn on cloud firm SoftLayer Technologies.
Posted in Cloud Hosting
Posted on January 6, 2014 at 7:38 am
IBM has announced the purchase of public cloud computing infrastructure firm SoftLayer in a sizeable boost to its cloud offerings to enterprises.
Terms of the deal were not disclosed, but figures around the $2bn mark were floated by numerous sources cited online. IBM said the deal was designed to help it boost its mix of cloud computing offerings so it could meet the needs of all firms, especially those looking for public cloud infrastructure.
Ric Telford, IBM vice president for Cloud Services, told V3 that the deal would help make the firm a ‘one-stop shop’ for cloud services as the demand for public cloud services from enterprises increases.
“One cloud does not fit all and there is no one approach to the cloud, it’s dependent on workloads, or the applications you want to deploy as to whether you want a private, public or hosted environment,” he said.
“That’s what intrigued us about SoftLayer to round out our portfolio of cloud offerings. They have all three models but one management layer, so we can broaden our existing portfolio and meet the demands of customers.”
Telford said that in recent years IBM has seen a growing number of enterprise customers show a willingness to operate in public cloud environments.
“In the early years most deployments were private cloud, but now we’re seeing that many firms are more comfortable dealing with public cloud offerings around software as a service and platforms,” he added.
The SoftLayer offerings will be incorporated into a new business unit within IBM’s Global Technology Services business, and be offered alongside the firm’s existing products in its SmartCloud portfolio so it can meet any customer’s cloud needs.
“Ultimately this gives us the breadth of flexibility. We know firms like other vendors’ offerings, but they don’t have the breadth of options that they can get with IBM,” added Telford.
“So you may want to run some applications in the private cloud, some in the public and have the ability to move them back and forth as you need and so now you can do this with our portfolio.”
Dallas-based SoftLayer has around 21,000 customers and owns 13 data centres in locations across the US, Asia and Europe, which will allow IBM to meet the needs of those working within the restrictions of data privacy laws.
“SoftLayer has a strong track record with born-on-the-cloud companies, and our move today with IBM will rapidly expand that footprint globally as well as allow us to go deep into the large enterprise market,” said Lance Crosby, chief executive of SoftLayer.
The use of public cloud computing services is growing rapidly, with providers such as Amazon Web Services hosting several notable companies, including digital streaming firm Netflix, within their infrastructure. However, the perils of the public cloud with regard to outages have been demonstrated on several occasions.
Posted in Cloud Hosting
Posted on January 4, 2014 at 3:51 pm
The outgoing head of the government’s G-Cloud service, Denise McDonagh, has confirmed her departure from the programme, which is moving under the auspices of the Government Digital Service (GDS).
McDonagh, who took charge of the G-Cloud programme last April, said the service would “forever change the way we commission and use IT in the public sector”.
“I can now hand over G-Cloud to GDS, safe in the knowledge that we have started such a groundswell of support and momentum for change that G-Cloud is here to stay,” she wrote on a G-Cloud blog.
“This has been the most enjoyable roller-coaster ride ever.”
G-Cloud has become a critical part of the government’s IT strategy, and is touted as the best way of procuring IT services cheaply, without getting tied into multi-year, multi-million pound contracts. It has also been heralded as the best way to open up public sector contracts to small and medium IT suppliers.
The service was the brainchild of Chris Chant, who spent more than three decades wrangling with the labyrinthine nature of Whitehall IT. Chant took to Twitter on Tuesday to sing the praises of the G-Cloud team.
Despite the upbeat tone of McDonagh’s announcement, however, there is plenty of work to be done before G-Cloud comes anywhere close to attaining its goal of changing public sector IT procurement. According to McDonagh, as of April 2013 a paltry £22m had been spent via G-Cloud, a drop in the ocean of government IT spend.
Earlier this week, Cabinet Office minister Francis Maude labelled the service “under used”.
McDonagh will revert to her role as IT director at the Home Office, a position she held in addition to overseeing G-Cloud.
Posted in Cloud Hosting
Posted on January 2, 2014 at 7:45 pm
Spend on the government’s under-pressure G-Cloud service has now hit £22m after several sizeable deals were signed off in April, including a spend of £1.3m with IBM by the Home Office.
The figure marks an increase of around £4m from the £18m that had been spent on the service by March and comes amid pressure on the platform to demonstrate more value for money, having failed to really ignite interest in the public sector since its launch a year ago.
This was driven by deals such as those won by IBM, as well as other notable wins, including a £68,000 deal signed off by the Cabinet Office with Steria and a £205,000 deal signed off by the Ministry of Justice with systems integrator i2N.
V3 contacted the Home Office for details of the services it purchased from IBM but had received no reply at the time of publication. IBM had also not responded.
While the increase shows the public sector is still interested in the platform, the figure is tiny compared with overall government IT spend and will do little to relieve the pressure on the service.
On Monday Cabinet Office minister Francis Maude admitted the service was ‘underused’, and the head of the service, Denise McDonagh, announced she was stepping down from managing G-Cloud as it moves to the Government Digital Service (GDS).
Posted in Cloud Hosting
Posted on December 31, 2013 at 8:47 pm
New Orleans: Microsoft has announced updates across a broad range of its enterprise products and services aimed at helping IT departments move forward with cloud computing and the new era of mobile devices.
At its TechEd 2013 conference, Microsoft announced Windows Server 2012 R2, System Center 2012 R2, SQL Server 2014 and updates to Windows Intune all coming later this year. The firm also showed off some of the changes coming in the updated Windows 8.1 platform.
Microsoft’s theme for the event was that the Windows platform and services around it continue to be best placed to meet the needs of enterprise customers, even as those customers look to embrace cloud computing and the brave new world of mobility.
“Microsoft’s vision of the cloud is the cloud OS,” said Brad Anderson, corporate vice president of Microsoft’s Server and Tools division. “We have promised to empower IT, enable modern business apps, unlock insights into data, and to transform the datacentre.”
In addition, there was the familiar refrain that Microsoft’s platforms are better together. System Center and Windows Server are more effective when combined, Azure is the best cloud platform for a Windows-based hybrid cloud strategy, and Windows Intune enables customers to extend System Center’s reach out to mobile devices.
The upcoming releases are all “significant updates to the versions we released last year,” Anderson claimed.
For example, Windows Server 2012 R2 now supports automated storage tiering using SSD and spinning disks in its Storage Spaces disk pooling feature, delivering a claimed 16x performance boost.
When used with System Center 2012 R2, customers can also live migrate virtual machines between different versions of Windows Server, with support for compression and deduplication speeding the process, according to Microsoft.
Windows Server 2012 R2 also includes a new feature called Workplace Join to help address the bring-your-own-device trend. This lets users connect to corporate resources from a device of their choosing, while allowing IT to apply policies to control how it is used.
SQL Server 2014 adds in-memory capabilities, delivering more powerful real-time transaction processing, according to Microsoft.
For Windows 8.1, Microsoft confirmed that the new platform will enable users of legacy applications to boot straight to the desktop environment, and will bring user interface changes, support for wireless printing and wireless display streaming.
“With Windows 8.1, we are continuing our vision to make sure Windows 8 tablets are the best business tablets, as well as delivering a better experience for users on devices such as laptops,” said Erwin Visser, senior director for Microsoft’s Windows Division.
Windows Server 2012 R2, System Center 2012 R2 and SQL Server 2014 will be available in preview later this month, Microsoft said. The first two are set to ship by the end of the year, with SQL Server 2014 following early next year.
Posted in Cloud Hosting
Posted on December 29, 2013 at 7:25 pm
Apple is reportedly securing a deal that would bring a major label on board for its upcoming streaming music service.
The Wall Street Journal cited sources familiar with the matter in reporting that the company has inked a deal with Warner Music to bring the label’s content to a “radio” channel streaming service from Apple. The deal reportedly includes an agreement to pay Warner 10 percent of the service’s advertising revenues.
The rate could help lure labels towards iTunes and away from would-be rival Pandora, which is said to pay labels a smaller percentage of ad revenues.
If true, the report lends further credibility to the belief that Apple will soon launch its streaming music service. The company has been said to be planning such a service for years, though no official word has come from Apple.
Speculation on the streaming service launch picked up earlier this year when researchers uncovered hidden code in the latest test versions of iOS. The unused components referenced versions of a streaming service from Apple.
Should Apple decide to launch its service this year, the unveiling could come as soon as next week, when the company holds its annual Worldwide Developers Conference. The presentation is believed to include updates on the latest versions of both OS X and iOS.
Posted in Cloud Hosting
Posted on December 27, 2013 at 1:11 pm
New Orleans: Microsoft is making the case for enterprise customers to use its Azure platform for cloud computing, and as an incentive is updating its pricing, removing charges for inactive virtual machines and moving to per-minute billing.
At its TechEd conference, Microsoft pushed hard on the promise of consistency across its Windows Server and Azure cloud platforms, making the case that it makes more sense for organisations to use this combination when extending their infrastructure out to the cloud.
“Organisations should demand the ability to move virtual machines and workloads across boundaries on demand and without friction,” said Brad Anderson, corporate vice president of Microsoft’s Server and Tools division.
“Azure is based on the same Windows Server 2012 platform as your datacentre, with the same management tools, the same consistency across data, so you can just move things around as you wish,” he added.
From today, Microsoft said it will no longer charge for halted virtual machines, allowing customers to save money on workloads that are not actively being used. The firm is also moving to per-minute billing rather than per-hour.
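In the Azure PowerShell module of the time, the difference comes down to whether a stopped VM stays provisioned. Below is a minimal sketch, assuming the classic Stop-AzureVM cmdlet and illustrative cloud service and VM names.

# Stopping without -StayProvisioned deallocates the VM, so it stops accruing compute charges
Stop-AzureVM -ServiceName "contoso-svc" -Name "web01"

# Stopping with -StayProvisioned keeps the VM allocated, so it continues to be billed
Stop-AzureVM -ServiceName "contoso-svc" -Name "web02" -StayProvisioned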
Perhaps of more interest to Microsoft customers is that Microsoft Developer Network (MSDN) subscribers can now use any server licences they hold to run workloads on Azure instead of on a physical machine.
MSDN subscribers will also qualify for special rates on Azure workloads, which could see up to a 97 percent discount on products such as SQL Server Enterprise edition, and monthly credits that can be used on any Azure resources.
Meanwhile, Microsoft said it was taking the know-how it has gained from operating Azure and rolling this back into its on-premise products.
One example is the Windows Azure Pack, which delivers cloud-like orchestration and scalability across infrastructure based on Windows Server and System Center.
“You can effectively download and install in your private cloud environment a bunch of the features previously only available in Windows Azure,” said Anderson.
Despite claims from other cloud platforms that customers do not want to be tied to a single provider, Anderson touted Microsoft’s credentials as the cloud provider of choice because of the sheer scale and reach of its platform.
“You’re looking for a partner that can give you reach around the globe. We are deploying hundreds of thousands of new servers per year. We can guarantee SLAs. We give you the flexibility to deploy to the cloud and to bring workloads back into your datacentre if required. As of this week, we have datacentre capacity in China. That gives you support to address that market,” he said.
Posted in Cloud Hosting
Posted on December 25, 2013 at 9:51 am
With the S4 now on the market for a month or so, there’s no doubt numerous owners of the former flagship S3 are wondering whether it’s worth their time and money upgrading.
This was why our head-to-head review of the two devices was the most popular content of the past week, as we put the two phones through their paces. The S4 won, as you’d expect, but the S3 still remains a high-end device.
Elsewhere the ongoing war of words between Google and Microsoft over customer wins, and losses, took another turn when Redmond took advantage of the raft of negative comments on our story about Pearson embracing Google services.
Mobile users with O2 received a bit of bad news this week when the company confirmed its deal to access 4,000 or so BT Openzone hotspots was coming to an end, although O2’s own WiFi network, in locations such as McDonald’s, is growing all the time.
Elsewhere, our hands-on picture blog of HP’s Moonshot technology also gave V3 readers a chance to see the low-power server technology up close and personal, as the firm used the DataCentres Europe 2013 conference as a chance to tout the technology.
Samsung Galaxy S4 vs Galaxy S3 head-to-head review
We find out if Samsung’s latest iPhone competitor is really worth the extra money
Microsoft sticks the boot in over Google App services use at Pearson
Redmond cites unhappy users as evidence that Google is out of its depth in the enterprise market
Samsung Galaxy Note 8.0 review
Tablet struggles to impress with its cheap build and high price tag
O2 ends free WiFi deal with BT
Customers lose access to thousands of hotspots
Microsoft brings anti-botnet fight to the cloud with Azure level-up
Firm looks to make businesses more agile in anti-hacker battle
Microsoft confirms Windows 8.1 will come with IE11
Windows 8 refresh will be available as a preview from 26 June
Workday CEO to SAP and Oracle: cloud market is passing you by
Aneel Bhusri is in fighting mood as cloud war of words continues
Google gives firms only seven days to come clean on zero-day vulnerabilities
Web firm will support researchers reporting unannounced exploits after a week
HP Moonshot low-power server technology in pictures
V3 gets up close and personal with HP’s low-power server kit
Mobile malware attacks will spread through sensors in handsets
Study describes infections that prey on hardware components
Posted in Cloud Hosting