Yearly Archives: 2013

EMC World: ViPR debuts for software-defined storage

Posted on September 8, 2013 at 10:21 am

Las Vegas: EMC has launched ViPR, a software-defined storage platform aimed at helping firms better manage virtualised storage.

The company said that the ViPR platform would allow firms to pool multiple hardware units and appliances into a single system which can be centrally controlled. In doing so, the company believes it can better enable firms to manage private clouds.

“What we are essentially doing is providing a layer of software that is going to allow you to manage our existing arrays, third-party arrays and, increasingly, commodity storage,” EMC executive vice president for product operations and marketing Jeremy Burton told attendees at the company’s annual EMC World conference.

The company said that ViPR would operate with two different ‘planes’ for storage management. The control plane will handle high-level management and automation, allowing administrators to perform basic management tasks.

Additionally, ViPR will provide customers with a second, lower-level management layer known as the data plane. The company said that this plane would allow interaction with individual blocks of data, giving more granular control for the administration and management of databases.
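The two-plane split is easiest to picture as a thin management layer pooling dissimilar arrays, with data access handled separately. The Python sketch below is purely illustrative and uses hypothetical class and method names rather than the actual ViPR API:

```python
# Purely illustrative: hypothetical classes modelling the control/data plane
# split described above, not the actual ViPR API.

class Array:
    """A single storage system: EMC, third-party or commodity."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.volumes = {}

class ControlPlane:
    """High-level management: pools heterogeneous arrays and automates provisioning."""
    def __init__(self):
        self.arrays = []

    def add_array(self, array):
        self.arrays.append(array)

    def provision_volume(self, volume_name, size_gb):
        # Naive placement: pick the array with the most free capacity.
        # A real controller would apply policy-driven placement instead.
        target = max(self.arrays, key=lambda a: a.capacity_gb)
        target.capacity_gb -= size_gb
        target.volumes[volume_name] = size_gb
        return target.name

class DataPlane:
    """Lower-level access: reads and writes individual objects on an array."""
    def __init__(self, array):
        self.array = array
        self.store = {}

    def put(self, key, blob):
        self.store[key] = blob

    def get(self, key):
        return self.store.get(key)

# Pool two dissimilar arrays behind one control plane.
control = ControlPlane()
control.add_array(Array("vmax-01", 10000))
control.add_array(Array("commodity-01", 4000))
print(control.provision_volume("app-data", 500))  # prints "vmax-01"
```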

Burton said that as applications have changed their approach to handling and utilising object storage, a new system such as the data plane is needed to adapt.

“We think a lot of the new apps that are going to be developed are going to be built in a different way,” he said.

“We need new controllers for these new content types.”

ViPR is currently undergoing closed tests with customers and is set for general availability in the second half of the year.

Central to the company’s philosophy with ViPR is a push for horizontal integration and the ability to combine multiple platforms from multiple vendors. Chief executive Joe Tucci said that EMC is looking to present its platforms as a counter to the vertical integration approach of rivals such as Oracle.

“Some companies say: use my applications, my operating system, my middleware, right down to the hardware. We are doing it very differently,” Tucci said.

“Yes, we’re going to put those technologies together, but you can also piece-pick, you can partner in any place on the chain.”

Posted in Cloud Hosting

Government pushes ‘cloud-first’ policy as 368 firms added to G-Cloud framework

Posted on September 6, 2013 at 8:56 am

The government has officially adopted a “cloud-first” policy at both a central department level and among the wider public sector as part of the ongoing overhaul of the government’s use of IT to drive savings and improve efficiencies.

The government had announced its intention to do this back in March but has now made this a central tenet of its strategy. Whitehall will expect departments to purchase from the G-Cloud as the main resource, with departments having to make a case for any non-cloud deployments.

Cabinet Office minister Francis Maude said: “Many government departments already use G-Cloud, but IT costs are still too high. One way we can reduce them is to accelerate the adoption of cloud across the public sector to maximise its benefits. The cloud-first policy will embed the skills a modern civil service needs to meet the demands of 21st-century digital government and help us get ahead in the global race.”

As part of the announcement, the government has also launched the third version of the G-Cloud framework, adding 368 new firms to the programme, bringing the total number accredited to sell services to the public sector to 708, of which over 80 percent are SMEs.

The government is hoping to use the G-Cloud to boost spending with SMEs after years of public sector IT being dominated by large firms on hugely expensive contracts. G-Cloud programme director Denise McDonagh said that, to date, the majority of the £18m spend on the CloudStore has been with SMEs, and hopes are that this will continue to rise.
 
“This is still small relative to overall government IT spend, and the transition to widespread purchasing of IT services as a commodity won’t happen overnight,” she said.

“The adoption of a cloud-first policy will give added impetus for Whitehall and the wider public sector to move in this direction – complementing our ongoing work to encourage cloud adoption and to help buyers adapt to this way of purchasing IT, which is already showing results.”

Earlier this year the government celebrated the one-year anniversary of G-Cloud’s launch, although industry watchers said uptake on the platform remains minimal. The government will be hoping its new policy changes that.

Posted in Cloud Hosting

iYogi launches cloud CRM service for remote PC support

Posted on September 4, 2013 at 6:45 pm

Remote support specialist iYogi is launching a cloud computing platform designed to let companies and service providers deliver their own remote management services.

The company said that its Digital Services Cloud customer relationship management (CRM) offering would allow customers to utilise the iYogi support network in their own services, letting enterprise and home service providers offer in-house branded support for end-user PCs.

Larry Gordon, iYogi president of global channel sales, told V3 that the cloud service is a response to demand the company has seen from its partners to open up various components of the iYogi support network, which utilises a combination of locally installed software and a remote support network to allow technicians to diagnose and repair systems through the cloud.

“The platform we run our company on has been improving over the last five years,” Gordon explained.

“Some customers want to just buy that for less.”

Though Digital Services Cloud will technically be part of a cloud CRM space dominated by Salesforce.com, the company hopes that the service will become a more specialised platform, fine-tuned specifically for the process of remotely managing PC maintenance. Service providers would be able to increase revenues by offering their support services using the iYogi platform and network.

“What we layer on top of that is this enormous knowledge base built on particular analysis of what tech support is,” Gordon said.

“It layers in that entire layer of big data we have collected on how to monitor these technology problems.”

The company also plans to remain flexible with its pricing. Gordon said that in addition to the traditional cloud subscription offering, customers can opt to purchase access to the platform with a one-time fee, or pay based on the number of users or on a share of the revenues they draw from their own deployments.

Posted in Cloud Hosting

VMware pushes hybrid cloud as stepping stone to IT as a service

Posted on September 2, 2013 at 1:07 pm

VMware sees the hybrid cloud as a key step towards the holy grail of delivering IT as a service (ITaaS) for its customers, but organisations must fully automate their own data centres before they can take that step.

At the VMware Forum in London, chief technologist Joe Baguley outlined VMware’s view of the hybrid cloud, and how it proposes to take customers on the journey towards implementing an ITaaS strategy.

Baguley said: “IT is struggling to keep up in the modern era. There’s a growing chasm between the line of business that wants to be agile and the IT department that wants to keep control of everything.”

The solution to this is not only to fully automate the data centre to make it easier to deploy new services, according to VMware, but also to exploit publicly available cloud computing services and integrate these with your on-premise IT.

“We think you should be looking at using public cloud as a natural extension of your own data centre. It should be as free and easy to move workloads between those clouds as it is to move them from one rack to another in your data centre,” Baguley said.

However, as Baguley conceded, many firms are a long way from realising this vision, and are still using VMware technology for server consolidation, if they have even got that far.

“The journey to ITaaS is one that many are just beginning. I still get IT guys coming up to me and asking how to virtualise their servers,” he said.

However, if organisations need a good reason to start down this path, VMware can give it to them; its customers have so far saved an estimated total of $10 billion through server virtualisation, Baguley claimed.

“And as you continue to build out the software-defined data centre, we believe you can turn those cost savings back into investments in the business,” he said.

The software-defined data centre is VMware’s existing cloud computing pitch to customers: virtualise everything and then build policy-based automation and orchestration layers so that departmental users do not have to wait for weeks to get a new service provisioned.

VMware customers can get there gradually by incrementally adding layers on top of vSphere, according to Baguley. “Or for those who can’t wait, you can jump straight in with vCloud suite as a single managed package,” he said.
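As a rough illustration of what that self-service layer enables, the sketch below shows a departmental user requesting a pre-approved catalogue item through a hypothetical provisioning API; the endpoint, catalogue item and token are placeholders, not a real VMware interface:

```python
# Hypothetical illustration of departmental self-service once an orchestration
# layer sits on top of the virtualised estate. The portal URL, catalogue item
# and token are placeholders, not a real VMware API.
import requests

PORTAL = "https://cloud-portal.example.internal/api"  # placeholder endpoint
TOKEN = "REPLACE_WITH_SESSION_TOKEN"                   # placeholder credential

request_body = {
    "catalogItem": "small-linux-web-server",  # pre-approved, policy-governed template
    "leaseDays": 30,                          # policy can enforce automatic expiry
    "costCentre": "marketing-042",            # charge-back replaces manual sign-off
}

resp = requests.post(
    PORTAL + "/provisioning-requests",
    json=request_body,
    headers={"Authorization": "Bearer " + TOKEN},
    timeout=30,
)
resp.raise_for_status()
print("Request accepted, tracking id:", resp.json().get("id"))
```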

Meanwhile, VMware is set to launch its vCloud Hybrid Service on May 21st, which will see partners offer public cloud services based on VMware’s platform.

This will enable VMware customers to realise the hybrid cloud, as they should then be able to move workloads seamlessly between their on-premise vCloud infrastructure and the public cloud operated by providers such as Savvis.

“What we’re doing with partners is enabling a rentable extension to your data centre,” explained Baguley.

However, as Gartner analyst Chris Wolf warned at the time of the vCloud Hybrid Service announcement, VMware could be seen as trying to restrict the public cloud choices its customers can make.

This is a charge that VMware denies, of course, with Baguley stating that it has to be open to keep its customers happy.

“Technology has to be non-disruptive to the customer, but disruptive to the market. It has to be open to everything, even OpenStack,” he said.

Posted in Cloud Hosting

Microsoft signs up 400 million Outlook.com users as Hotmail migration ends

Posted on August 31, 2013 at 7:44 pm

Microsoft has wrapped up its massive Hotmail user migration programme.

The company said that it has now transferred all of the user accounts from Hotmail to its Outlook.com service, ending a data transfer campaign which shifted some 150 petabytes of user data.

“When Outlook.com came out of preview in February, it already had more than 60 million active accounts,” wrote Outlook.com group program manager Dick Craddock.

“However, Hotmail was still one of the most widely used services, with over 300 million active accounts. This made the magnitude of the process incredible, maybe even unprecedented.”

With the migration complete, Microsoft now estimates that Outlook.com boasts more than 400 million active users. The milestone comes some two months after the company reported that the service had over 60 million users.

Additionally, Microsoft reported that some 125 million Outlook.com users also use the Exchange ActiveSync service.

To mark the occasion, Microsoft is rolling out an update to the Outlook.com service. The new features include a new interface option, the ability to send messages from other accounts without an ‘on behalf of’ notification, and deeper integration with SkyDrive.

Craddock said that with the new SkyDrive integration, users will be able to more efficiently select and attach files and images from SkyDrive directly to messages.

“When you’re sending an email message, you can select files from your SkyDrive and we’ll automatically turn those into the right thumbnails with links that have the right permissions tied to people that receive the email,” he explained.

“When you insert pictures from SkyDrive, you automatically get a beautiful photo mail. And it’s easy to edit the message, and add or remove files and pictures right from the new message compose experience.”

Posted in Cloud Hosting

Cloud data protection issues pose a challenge to firms

Posted on August 29, 2013 at 1:31 pm

Businesses moving to the cloud are facing tough challenges around data protection and data use, according to tech giant Philips and insurance firm JLT.

The chief information officer of JLT, Ian Cohen, said the boom of cloud computing services has left the company facing a number of data protection challenges, both with suppliers and customers.

“We operate in a highly regulated environment and trust is vitally important to us,” he said, speaking at a Salesforce customer event on Thursday.

“Not just the implied trust in our brand, but the implied trust that exists between us and our client when we handle their data. And in today’s world increasingly, we are challenged to be explicit about how and where data is accessed.”

Philips vice president Wim Van Gils agreed with this, explaining that the issue of data privacy is particularly relevant to firms using sensitive data.

“We see similar issues with our healthcare business where we’re hit with compliance and a set of security requirements that are enforced by law. When we look at our relationship with consumers we want to be a trusted brand because we’re in the health sector and we never want to compromise that,” he said.

Van Gils said that many companies’ unwillingness to ask for aid from cloud service providers has caused them to take a misguided, tick-box legal approach to data collection and privacy.

“We want to be very explicit about what information we’re collecting and how we’re using it. Not in some 15-page legal [document] showing what they agree to, we want to bring it up front because we believe it’s one of the foundations of becoming a digital company,” he said. “We need all the help we can get because this is quite new. Most companies are very implicit about it and I think we’re entering an age where we need to be explicit about it.”

Cloud service provider Salesforce’s chief scientist JP Rangaswami said the company is aware of the challenge and is working to create solutions for the privacy problem facing cloud users.

“Data protection is a core concept,” he said. “The phrase people use is informed consent. To get informed consent people need to know what is being collected and how it is being used. The customer needs to be aware of that, they need to know what is being collected.

“This is because it’s not our information. The best we can do is ensure what we hold is solid and that we give our customers the ability to communicate back to their customers.”

However, JLT’s Cohen said that even with the help of bespoke cloud service experts like Salesforce, data privacy issues will continue. “It’s a big issue and Salesforce are helping us but I think there’s more to be done. We need to be even more transparent and to be even more supportive around data privacy, data allocation and data residency and all of these issues.”

The news comes just after Salesforce announced plans to open a new data centre in the UK. The centre will open in Slough in 2014 and is designed to extend the company’s European cloud services.

Posted in Cloud Hosting

Oracle brings HTML5 support to iPad with Secure Global Desktop 5.0 update

Posted on August 27, 2013 at 5:07 pm

Oracle has announced an update to its Oracle Secure Global Desktop platform, which enables workers with an iPad or other mobile device to access applications running on Exalogic Elastic Cloud infrastructure.

Available now, Secure Global Desktop 5.0 extends the platform’s back-end support to provide certified access to Oracle Exalogic Elastic Cloud and web-based Oracle applications, including Oracle CRM, the firm said.

In particular, this version has the capability to deliver applications to an HTML5-compliant browser, so users do not need to download and install a software client. This is currently supported only in the Safari browser on the iPad, however.

Wim Coekaerts, Oracle senior vice president for Linux and virtualisation engineering, said: “Enterprise users expect increasingly more mobile access to applications which are often designed to run on desktop PCs. Oracle Secure Global Desktop provides IT with a highly secure remote access solution for such applications, and even full desktop environments, from tablets.”

Oracle Secure Global Desktop is based on technology that Sun Microsystems acquired from Tarantella. It serves up applications or entire virtual desktops hosted in the datacentre to remote users, with Windows PCs, Macs and Oracle Sun Ray Clients already supported as endpoints.

This version also adds support at the back end for servers running Oracle Solaris 11.1 and Oracle Linux 6.4, the firm said.

New support on the client side includes Windows 8 and OS X Mountain Lion. Supported browsers now include Internet Explorer 10, Chrome, and Firefox Extended Support Release (ESR).

Posted in Cloud Hosting

Amazon talks up AWS certification and security programmes

Posted on August 25, 2013 at 8:50 am

Amazon has launched a certification programme based on its Amazon Web Services (AWS) cloud computing platform.

The company said that its AWS Certification Program would seek to provide credentials for developers and administrators looking to oversee the development and operation of AWS cloud and application deployments.

The certification programme will include a series of exams which developers can take at local testing facilities. Once candidates pass the exams, which cover cloud platform architecture, operations, administration and development, the company will issue certifications verifying expertise in various areas of AWS products.

Amazon said that it hopes to be offering its full range of exams at some 750 testing centres by the end of the year.

The company said that its main goal with the programme was to provide customers with a means of verifying that employees were able to properly build and administer their AWS deployments.

“With cloud computing being quickly adopted by organizations of all sizes around the world, in-depth training programs as well as certifications for individuals who have demonstrated competence with AWS are increasingly important,” said Amazon Web Services vice president Adam Selipsky.

“The AWS Certification Program helps organizations identify that the employees, partners and consultants they depend on for their AWS solutions are well-versed in the best practices of building cloud applications on AWS and have the skills to help them be successful.”

Amazon is not the only firm looking to provide training and accreditation to professionals. While the rise in cloud computing has brought a number of new services to the market and given rise to additional fields such as big data analysis, vendors have begun to find themselves facing a shortage of qualified administrators.

Firms such as EMC and IBM have sought to help universities train students on managing their platforms.

At the lower levels, the UK Department for Education is exploring its own initiatives to update and modernise the computing curriculum to better prepare students.

Posted in Cloud Hosting

Brocade introduces on-demand datacentre strategy

Posted on August 23, 2013 at 9:08 am

Brocade has unveiled a strategy to deliver what it calls the on-demand datacentre through a combination of physical and virtual networking to overcome the limitations of legacy networks.

The company said that by combining physical and virtual networking, users can create a network that reduces complexity and offers scalable virtualisation on-demand.

Brocade claimed that its new initiative will reduce the barriers to entry for companies looking to adopt a software-defined network (SDN) strategy.

“The On-Demand Data Center strategy from Brocade provides an end-to-end solution that spans the physical, virtual and orchestration layers of the datacentre,” Kelly Herrell, vice president and general manager of software networking at Brocade, told V3.

“It brings advanced technologies into play in a pragmatic and evolutionary way, offering a unique path toward software-defined networking (SDN). With Brocade, customers can build upon their current infrastructure investments while moving toward the next evolution in networking.”

Brocade’s strategy aims to create offerings that can use both physical and virtual networking tools. The firm says that the offerings will allow users to mix and match both types of networking options, offering greater flexibility to quickly deploy cloud-based services.

According to the firm, the combination can handle large-scale multi-tenancy better than legacy networks. Brocade says that it allows users to see all servers as a constantly growing pool of shared resources.

Brocade also claimed that the combination enables reduced overhead and shorter deployment times through self-service provision models.

Along with the on-demand strategy, Brocade has launched a variety of updated offerings. These include Brocade’s recently announced VCS fabric plug-in. The plug-in offers users the chance to create on-demand provisioning capabilities for OpenStack clouds.
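Assuming the plug-in is exposed through the standard OpenStack Neutron API, on-demand provisioning from the tenant side would look something like the following sketch, which uses the open-source python-neutronclient with placeholder credentials and addresses:

```python
# Sketch of tenant-side, on-demand network provisioning through the standard
# OpenStack Neutron API; with a fabric plug-in such as Brocade's configured,
# calls like these would be realised on the underlying physical fabric.
# Credentials and URLs below are placeholders.
from neutronclient.v2_0 import client

neutron = client.Client(
    username="demo",
    password="REPLACE_ME",
    tenant_name="demo",
    auth_url="http://controller.example.internal:5000/v2.0",
)

# Create a tenant network and attach a subnet to it.
net = neutron.create_network({"network": {"name": "app-tier", "admin_state_up": True}})
net_id = net["network"]["id"]

neutron.create_subnet({"subnet": {
    "network_id": net_id,
    "ip_version": 4,
    "cidr": "10.20.0.0/24",
}})

print("Provisioned network", net_id)
```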

Herrell said that building tools for open networks like OpenStack is key to the new era of the market.

“Open network solutions are a new era for the industry. The advantages can be summed up in two words: choice and speed. For choice, openness facilitates solution and vendor interoperability. This allows the customer to select the right tools for the right jobs instead of being force-fed what a single vendor offers,” continued Herrell.

“For speed, this is the benefit gained when open industry collaboration advances technologies faster than what any single vendor acting alone can produce. This naturally improves the speed with which datacentre architecture can adapt to new business pressures.”

Brocade has long been a supporter of open technologies such as OpenStack and OpenDaylight. Herrell said that open initiatives allow companies to plan for growth in the datacentre and prevent vendor lock-in.

“Customers need to drive their datacentre strategies forward with urgency. They can’t wait; they need to do their architecting and planning now, using the most open and advanced tools they can get their hands on,” continued Herrell.

“Open network solutions such as OpenStack and OpenDaylight are constantly evolving and improving as the ecosystem collaborates on advancing the state of the technology. Importantly, this advancement is being done in an open environment which eliminates vendor lock-in.”

OpenStack has been a major focus in the news of late. Red Hat executives recently warned that the platform was the only way to guarantee interoperability between different cloud operators.

Posted in Cloud Hosting

SoftLayer offers high-performance Riak cloud database service

Posted on August 21, 2013 at 10:58 am

Hosting firm SoftLayer has introduced a pay-as-you-go database service based on the open-source Riak NoSQL engine, claiming to offer a turnkey environment for customers developing applications around big data.

Available now, the Riak and Riak Enterprise services have been developed through a partnership between SoftLayer and Basho, creator of the Riak database itself.

The end result combines the high availability, fault tolerance and scalability of Riak with the flexibility and ease of access of SoftLayer’s on-demand infrastructure, the two firms said.

“We are offering the ability for customers to very easily design their own Riak solution via a Solution Designer tool we have produced that makes it easy to order a multi-server configuration,” Marc Jones, vice president of product innovation at SoftLayer, told V3.

However, the Riak offering is not strictly a cloud service in the same way as Amazon’s DynamoDB platform, as customers are getting their own dedicated hardware that they can configure how they like using the Solution Designer tool.

“In terms of rapid provisioning, consumption-based billing and the ability to scale, we regard this as being cloud, especially when compared with traditional IT provisioning,” argued Jones.

And while the SoftLayer approach may take longer to provision (up to two hours) than something like DynamoDB, it offers users much greater flexibility in tailoring the infrastructure to an application’s exact requirements.

Customers can choose between small or medium servers, which offer up to four or up to 12 drives respectively, and also specify memory configurations. Local storage options comprise Serial ATA (SATA), serial-attached SCSI (SAS) or solid-state drives, with RAID support for data resilience at each node, while each node is connected by twin Gigabit connections bonded together to provide 2Gbit/s of network bandwidth.
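Once a cluster has been provisioned, applications talk to it through the standard open-source Riak client libraries. Below is a minimal sketch using Basho’s Python client, with placeholder node addresses standing in for the servers ordered through the Solution Designer:

```python
# Minimal sketch of an application talking to a provisioned Riak cluster using
# Basho's open-source Python client. Node addresses are placeholders for the
# private IPs of the servers ordered through the Solution Designer.
import riak

client = riak.RiakClient(nodes=[
    {"host": "10.0.0.11", "pb_port": 8087},
    {"host": "10.0.0.12", "pb_port": 8087},
])

bucket = client.bucket("player_events")

# Store a small JSON document and read it back.
bucket.new("event-0001", data={"player": "alice", "score": 1200}).store()
print(bucket.get("event-0001").data)
```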

Meanwhile, the Riak Enterprise version of the service offers customers the ability to replicate their database cluster across multiple SoftLayer datacentres around the globe, to serve traffic in multiple regions or simply for disaster recovery and failover requirements.

SoftLayer operates its own private network between its 13 global datacentres, according to Jones, so there is no additional charge for replication traffic. Riak is being aimed at web businesses building applications that may generate a lot of data for analysis, according to SoftLayer, such as gaming sites or social media.

“These are verticals driven in a large part by the data that they capture from their customers, and they are able to derive analytics from this data that ultimately drives their business forward,” said Jones.

SoftLayer is thus aiming to attract customers looking for a robust platform to store their critical data, and such customers could well end up using SoftLayer as their main hosting provider in order to cut network latency between their web tier and the Riak database back-end.

Pricing for Riak starts at $359 (£231) per server per month, while Riak Enterprise is $600 (£386) per node per month or an annual price of $6,000 (£3,869) per node.

Posted in Cloud Hosting
