Posted on August 25, 2013 at 8:50 am
Amazon has launched a certification programme based on its Amazon Web Services (AWS) cloud computing platform.
The company said that its AWS Certification Program would seek to provide credentials for developers and administrators looking to oversee the development and operation of AWS cloud and application deployments.
The certification programme will include a series of exams which developers can take at local testing facilities. Upon passing the exams, which cover cloud platform architecture, operations, administration and development, the company will issue certifications verifying expertise in various areas of AWS products.
Amazon said that it hopes to be offering its full range of exams at some 750 testing centres by the end of the year.
The company said that its main goal with the programme was to provide customers with a means of verifying that employees were able to properly build and administer their AWS deployments.
“With cloud computing being quickly adopted by organizations of all sizes around the world, in-depth training programs as well as certifications for individuals who have demonstrated competence with AWS are increasingly important,” said Amazon Web Services vice president Adam Selipsky.
“The AWS Certification Program helps organizations identify that the employees, partners and consultants they depend on for their AWS solutions are well-versed in the best practices of building cloud applications on AWS and have the skills to help them be successful.”
Amazon is not the only firm looking to provide training and accreditation to professionals. The rise of cloud computing has brought a number of new services to market and given rise to additional fields such as big data analysis, but vendors have begun to face a shortage of qualified administrators.
Firms such as EMC and IBM have sought to help universities train students on managing their platforms.
At the school level, the UK Department for Education is exploring its own initiatives to update and modernise the computing curriculum to better prepare students.
Posted in Cloud Hosting
Posted on August 23, 2013 at 9:08 am
Brocade has unveiled a strategy to deliver what it calls the on-demand datacentre through a combination of physical and virtual networking to overcome the limitations of legacy networks.
The company said that by combining physical and virtual networking, users can create a network that reduces complexity and offers scalable virtualisation on-demand.
Brocade claimed that its new initiative will reduce the barriers to entry for companies looking to adopt a software-defined network (SDN) strategy.
“The On-Demand Data Center strategy from Brocade provides an end-to-end solution that spans the physical, virtual and orchestration layers of the datacentre,” Kelly Herrell, vice president and general manager of software networking at Brocade, told V3.
“It brings advanced technologies into play in a pragmatic and evolutionary way, offering a unique path toward software-defined networking (SDN). With Brocade, customers can build upon their current infrastructure investments while moving toward the next evolution in networking.”
Brocade’s strategy aims to create offerings that can use both physical and virtual networking tools. The firm says that the offerings will allow users to mix and match both types of networking options, offering greater flexibility to quickly deploy cloud-based services.
According to the firm, the combination can handle large-scale multi-tenancy better than legacy networks. Brocade says it allows users to see all servers as a single, constantly growing pool of shared resources.
Brocade also claimed that the combination enables reduced overhead and shorter deployment times through self-service provisioning models.
Along with the on-demand strategy, Brocade has launched a variety of updated offerings. These include Brocade’s recently announced VCS fabric plug-in. The plug-in offers users the chance to create on-demand provisioning capabilities for OpenStack clouds.
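Brocade's plug-in slots in beneath OpenStack's networking API, so from an operator's point of view provisioning looks like any other OpenStack call. As a rough, non-Brocade-specific illustration, creating a tenant network on demand through the Python networking client (Quantum at the time of Grizzly, since renamed Neutron) might look like the following; the credentials, endpoint and names are all hypothetical:

    from neutronclient.v2_0 import client

    # Hypothetical credentials for an OpenStack cloud
    neutron = client.Client(username='demo', password='secret',
                            tenant_name='demo',
                            auth_url='http://controller:5000/v2.0')

    # Create a tenant network and subnet on demand; a fabric plug-in
    # such as Brocade's realises the request on the physical network
    net = neutron.create_network({'network': {'name': 'tenant-net'}})
    net_id = net['network']['id']
    neutron.create_subnet({'subnet': {'network_id': net_id,
                                      'ip_version': 4,
                                      'cidr': '10.0.0.0/24'}})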
Herrell said that building tools for open platforms such as OpenStack is key as the market enters a new era.
“Open network solutions are a new era for the industry. The advantages can be summed up in two words: choice and speed. For choice, openness facilitates solution and vendor interoperability. This allows the customer to select the right tools for the right jobs instead of being force-fed what a single vendor offers,” continued Herrell.
“For speed, this is the benefit gained when open industry collaboration advances technologies faster than what any single vendor acting alone can produce. This naturally improves the speed with which datacentre architecture can adapt to new business pressures.”
Brocade has long been a supporter of open technologies such as OpenStack and OpenDaylight. Herrell says that open initiatives allow for companies to plan for growth in the datacentre and prevents vendor lock-in.
“Customers need to drive their datacentre strategies forward with urgency. They can’t wait; they need to do their architecting and planning now, using the most open and advanced tools they can get their hands on,” continued Herrell.
“Open network solutions such as OpenStack and OpenDaylight are constantly evolving and improving as the ecosystem collaborates on advancing the state of the technology. Importantly, this advancement is being done in an open environment which eliminates vendor lock-in.”
OpenStack has been a major focus of industry news of late, with Red Hat executives recently warning that the platform is the only way to guarantee interoperability between clouds run by different operators.
Posted in Cloud Hosting
Posted on August 21, 2013 at 10:58 am
Hosting firm SoftLayer has introduced a pay-as-you-go database service based on the open-source Riak NoSQL engine, claiming to offer a turnkey environment for customers developing applications around big data.
Available now, the Riak and Riak Enterprise services have been developed through a partnership between SoftLayer and Basho, creator of the Riak database itself.
The end result combines the high availability, fault tolerance and scalability of Riak with the flexibility and ease of access of SoftLayer’s on-demand infrastructure, the two firms said.
“We are offering the ability for customers to very easily design their own Riak solution via a Solution Designer tool we have produced that makes it easy to order a multi-server configuration,” Marc Jones, vice president of product innovation at SoftLayer, told V3.
However, the Riak offering is not strictly a cloud service in the same way as Amazon’s DynamoDB platform, as customers are getting their own dedicated hardware that they can configure how they like using the Solution Designer tool.
“In terms of rapid provisioning, consumption-based billing and the ability to scale, we regard this as being cloud, especially when compared with traditional IT provisioning,” argued Jones.
And while the SoftLayer approach may take longer to provision (up to two hours) than something like DynamoDB, it offers users much greater flexibility in tailoring the infrastructure to an application’s exact requirements.
Customers can choose between small or medium servers, which offer up to four or up to 12 drives respectively, and can also specify memory configurations. Local storage options comprise Serial ATA (SATA), serial-attached SCSI (SAS) or solid-state drives, with RAID support for data resilience at each node, while each node is connected by twin Gigabit connections bonded together to provide 2Gbit/s of network bandwidth.
Meanwhile, the Riak Enterprise version of the service offers customers the ability to replicate their database cluster across multiple SoftLayer datacentres around the globe, to serve traffic in multiple regions or simply for disaster recovery and failover requirements.
SoftLayer operates its own private network between its 13 global datacentres, according to Jones, so there is no additional charge for replication traffic. Riak is being aimed at web businesses building applications that may generate a lot of data for analysis, according to SoftLayer, such as gaming sites or social media.
“These are verticals driven in a large part by the data that they capture from their customers, and they are able to derive analytics from this data that ultimately drives their business forward,” said Jones.
SoftLayer is thus aiming to attract customers looking for a robust platform to store their critical data, and such customers could well end up using SoftLayer as their main hosting provider in order to cut network latency between their web tier and the Riak database back-end.
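For developers, the hosted service behaves like any other Riak cluster. A minimal sketch of storing and fetching a record with Basho's Python client; the host, bucket and key are hypothetical:

    import riak

    # Connect to one node of the cluster over Riak's protocol buffers
    # interface (port 8087 by default)
    client = riak.RiakClient(protocol='pbc', host='riak1.example.com',
                             pb_port=8087)

    bucket = client.bucket('player_events')

    # Store a JSON-serialisable record, then fetch it back
    obj = bucket.new('session-42', data={'player': 'alice', 'score': 1200})
    obj.store()
    print(bucket.get('session-42').data)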
Pricing for Riak starts at $359 (£231) per server per month, while Riak Enterprise is $600 (£386) per node per month or an annual price of $6,000 (£3,869) per node.
Posted in Cloud Hosting
Posted on August 19, 2013 at 1:16 pm
Enterprise storage firm Box has announced plans to hire 100 new employees in EMEA and move to larger offices in London as it continues to grow rapidly.
The firm first moved to London last June but has quickly outgrown that space and is now set to move to new offices as part of its plans to hire new staff.
“We have a goal to hire 100 people in EMEA by the end of the year and have already hired 50 employees since we launched last June,” European manager David Quantrell said in a statement to V3.
“This summer, our London team will be moving into a larger facility that caters for 150-plus employees with an option to double the size of the space as we continue to grow.”
The European market is the next major focus for Box, which also plans office moves in France and Germany, but London is to remain the hub of the firm’s operations.
“Box is investing millions into its London team expansion and the European market. It’s critical to our success and we’ve centralised our European efforts out of London,” added Quantrell.
The London expansion by Box was first revealed by chancellor George Osborne at an event attended by V3 on Thursday.
Box is becoming a major player in the enterprise market as its storage platform offers a dedicated corporate version of tools such as Dropbox and Google Drive with more security and policy control processes in place.
Earlier this year it announced a sizeable customer win of 50,000 seats from Schneider Electric. V3 interviewed Box chief executive Aaron Levie in the last issue of the V3 Tablet App, which you can download for free simply by registering on the V3 website.
Posted in Cloud Hosting
Posted on August 17, 2013 at 5:06 pm
Red Hat is warning businesses that an open cloud approach is the only way to avoid lock-in, claiming that only the OpenStack platform will deliver interoperability between clouds operated by different service providers.
At a roundtable event in London, Red Hat executives outlined their vision for the open cloud and how it can deliver on the hybrid cloud vision where organisations will be free to run workloads on-premise or in the public cloud, as best meets their needs.
Red Hat is enthusiastically backing OpenStack for its cloud offerings, including those for enterprise customers building a private cloud and those aimed at service providers seeking to deliver public cloud services.
“OpenStack is an open framework under which we can realise the dream. We’ve become part of a flourishing community that is developing these standards around cloud governance, because we realise that no single company is going to get us there,” said Red Hat chief technology officer, Brian Stevens.
As one of the leading enterprise Linux distributors, Red Hat finds OpenStack an obvious fit for its existing strategy. The firm is following a similar approach to that used for Red Hat Enterprise Linux (RHEL), carefully testing and certifying its build of OpenStack before releasing it to customers as a fully supported product.
Called Red Hat OpenStack, the product will be based on the latest Grizzly version of the OpenStack code and will follow the community release by three months, according to Stevens, meaning it is set for general availability in early July.
Meanwhile, Red Hat this month unveiled a parallel community-supported distribution of OpenStack called RDO. This freely available distribution will act as an incubator for upcoming technologies in OpenStack, in the same way that the Fedora Linux build does for technologies destined for RHEL.
Stevens said that Red Hat is seeing as much interest in adopting OpenStack at the service provider level as among its enterprise customers.
“It’s almost impossible to meet a telco now who doesn’t have an OpenStack strategy,” he claimed.
Providers of public cloud services thus seem to be coalescing around OpenStack, save for Amazon and Microsoft, which have their own proprietary platforms, plus a small number that are operating VMware-based services.
According to Red Hat, standardising on OpenStack at the enterprise and service provider end will make it much easier to link up the private and public cloud infrastructure to deliver a hybrid cloud strategy.
Posted in Cloud Hosting
Posted on August 15, 2013 at 9:44 am
Amazon is reportedly working to develop a set-top box which would allow the company to serve users with streaming video.
Bloomberg cited company sources in reporting that the firm was working on a branded device which would allow users to access its video streaming services.
The report did not mention what possible services Amazon could offer with the device in addition to its own streaming video platforms. The company currently offers its video player software on a number of home entertainment devices and gaming consoles.
Amazon has given no official word on the development or possible release of the device.
Such a launch could, however, put Amazon in direct competition with the biggest names in the home entertainment market. Apple offers its own video services through Apple TV, while the Sony PlayStation, Microsoft Xbox and Nintendo Wii brands also offer support for streaming services including Netflix and Amazon’s own video player.
Outside of the gaming consoles, home entertainment boxes have yet to truly catch on in the market. Apple’s TV box has long been an afterthought in the company’s hardware line, and dedicated streaming boxes such as the Roku player have only begun their mission to crack the consumer video market.
The move could also mark another step by Amazon to transition itself from a web-based retailer and service provider to a hardware vendor and home entertainment heavyweight.
In addition to its multi-billion dollar retail service and AWS enterprise operations, the company has built a name for itself in the tablet space with the success of its Kindle tablet line.
Posted in Cloud Hosting
Posted on August 13, 2013 at 2:42 pm
Ubuntu developer Canonical has officially announced the latest release of its Linux platform, Ubuntu 13.04, which delivers enhancements for those using the operating system to build an OpenStack cloud.
Available for download from tomorrow, Ubuntu 13.04, codenamed Raring Ringtail, introduces several enhancements on the server side aimed at cloud computing, including integration of the latest OpenStack Grizzly update that was pushed out earlier this month. Other improvements include an overhaul of the Juju orchestration tool, integration with the Ceph open-source storage technology, and an update of the Floodlight OpenFlow controller for software-defined networking.
Mark Baker, server product manager at Canonical, told V3 that many of the changes introduced in 13.04 lay the foundations for the next long-term support (LTS) release of Ubuntu, for which Canonical guarantees maintenance and security updates for a period of five years.
“People deploying OpenStack cloud are doing so primarily on the LTS releases, so this release and [the upcoming] 13.10 are really the proving ground to prepare for 14.04 LTS, set for April 2014,” Baker said, although he added that the latest version is a stable build ready for production use.
To this end, Canonical is looking to align Ubuntu’s cloud support around three main priority areas for datacentre users: scale-out storage, networking and compute technology, according to Baker.
For the compute part, Ubuntu integrates OpenStack Grizzly with 13.04 and makes greater use of Juju to enable administrators to deploy OpenStack in a highly available way. This means removing single points of failure, setting up failover for the database components, and adding other redundancy measures. Juju itself also now has a richer GUI that helps administrators visualise the services they are deploying and the relationships between them, Baker said.
For storage, Ubuntu 13.04 now integrates the Ceph open-source distributed storage system to provide scalable block storage, object storage and a POSIX-compliant file system.
“We’ve seen interest from our users in operating that as part of OpenStack for object and block storage,” Baker said, explaining that it offers an alternative to OpenStack’s own Swift and Cinder modules but enables both block and object storage on the same platform.
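At the lowest level, applications reach a Ceph cluster through its librados bindings rather than through OpenStack. A minimal sketch in Python, assuming the stock configuration file path and a pool named 'data':

    import rados

    # Connect to the cluster using the standard Ceph config file
    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
    cluster.connect()

    # Open an I/O context on a pool and store an object
    ioctx = cluster.open_ioctx('data')
    ioctx.write_full('greeting', b'stored via librados')
    print(ioctx.read('greeting'))

    ioctx.close()
    cluster.shutdown()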
Meanwhile, Ubuntu 13.04 also includes an updated version of the open-source Floodlight OpenFlow controller, designed to control both physical and virtual network switches that support the OpenFlow protocol.
“We’ve been including Floodlight on Ubuntu for a little while, but it’s gone through a bit of an update that is a reasonable step up in terms of functionality and features,” said Baker.
This provides OpenStack users with an open-source alternative to the Nicira NVP technology, which Canonical and VMware recently enabled support for in OpenStack, Baker added.
“This is driven by the desire, as we head towards 14.04, to have more robust open-source options available for people. While Nicira is great technology, it is proprietary, and you have to pay VMware’s prices to use it,” he said.
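Floodlight is driven over a REST API, so basic interrogation of the network needs nothing more than an HTTP client. A minimal sketch that lists the switches connected to a controller, assuming the default REST port on localhost:

    import requests

    # Ask the controller which OpenFlow switches are connected
    url = 'http://localhost:8080/wm/core/controller/switches/json'
    for switch in requests.get(url).json():
        # The DPID field name varies slightly between Floodlight versions
        print(switch.get('dpid', switch))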
Posted in Cloud Hosting
Posted on August 11, 2013 at 1:02 pm
Amazon Web Services (AWS) is making the case for its cloud platform as a driver of business innovation, saying that as the cost of using its infrastructure falls, so does any risk associated with a new venture.
The firm also argues that AWS is now a mature and robust enough platform for enterprise workloads, citing some customers using its infrastructure to operate even mission-critical applications.
At the AWS Summit in London, chief technology officer Werner Vogels said the cloud platform has had a fundamental impact on how IT has evolved since it launched in 2006. He stressed the firm’s commitment to openness and value as reasons for the success of AWS.
“We do not lock you in to any type of technology. You can choose any operating system and any application; you can run them all on AWS. There is no contract to force you to be our customer for, say, five years, and this means we need to be on our toes – if you’re not satisfied, you can just walk away,” he said.
As Amazon continues to expand, economies of scale cut its costs, which the firm passes on to customers to keep them happy; some customers saw a 40 percent reduction in their bills at the start of 2012. This also helps to ensure customer success, according to Vogels.
“If we can get the cost of computing down low enough that you don’t need to worry about it, then the type of new applications we can help create will be enormous. Our aim is to make infrastructure so cheap that it will drive innovation,” he said.
Vogels claimed that economics rather than technology is driving cloud uptake, with customers realising that they can gain access to IT resources quickly without any purchase cost, and only pay for what they use.
“You increase innovation when the cost of failure approaches zero, and so you can stop wasting money on IT, and spend it on the things that really matter for your business – building better products,” he said.
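The economics Vogels describes follow directly from the programmability of the platform: capacity is acquired and released through API calls, so a failed experiment costs only the hours it ran. A minimal sketch using the boto library, current at the time; the AMI ID is a placeholder:

    import boto.ec2

    # Connect to a region and launch a small instance on demand
    conn = boto.ec2.connect_to_region('eu-west-1')
    reservation = conn.run_instances('ami-12345678',
                                     instance_type='t1.micro')
    instance = reservation.instances[0]

    # ... run the experiment ...

    # Terminate the instance when done; billing stops with it
    conn.terminate_instances(instance_ids=[instance.id])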
Posted in Cloud Hosting
Posted on August 9, 2013 at 4:55 pm
A recent study found that big data and cloud software led the enterprise software market to $342bn in revenue last year.
According to the study from research firm IDC, the enterprise software market grew over three percent year-over-year. The firm reports that software categories that include big data and cloud programmes made up a majority of the market’s revenue.
“The global software market, comprised of a multi-layered collection of technologies and solutions, is growing more slowly in this period of economic uncertainty. Yet there is strong growth in selective areas,” said Henry D. Morris, senior vice president for worldwide software, services and executive advisory research at IDC.
“The management and leveraging of information for competitive advantage is driving growth in markets associated with big data and analytics. Similarly, rapid growth in cloud deployments is fueling growth in application areas associated with social business and customer experience.”
IDC says application development and deployment (AD&D) software, such as data analytics and data management tools, made up a major part of the growth. The firm found that AD&D software accounted for 24 percent of total enterprise software revenues in 2012.
The sector grew over four percent year-over-year in 2012. Oracle was reported to be the most profitable AD&D software firm, owning over 21 percent of the market.
Microsoft was the biggest supplier of enterprise software overall, with a market share of over 17 percent in 2012, according to IDC. It was followed by IBM, Oracle and SAP in the sector.
IBM and Microsoft enterprise software revenues stayed fairly flat in 2012, with growth rates of about one percent, while Oracle and SAP saw their market shares grow slightly.
Oracle saw enterprise software revenues jump over three percent in 2012, while SAP saw its revenues grow over five percent.
SAP’s growth in the market may be attributable to a major cloud software push in 2012. The firm reported that its cloud software was used by over 17 million users last November.
Oracle also made a widespread push into the cloud last year. During its OpenWorld event last October, Oracle launched a slew of cloud software options to the market.
The firm has been playing catch-up in the cloud marketplace for the past few years, and 2012 marked one of its larger pushes into the sector.
Posted in Cloud Hosting
Posted on August 7, 2013 at 12:01 pm
Shell is undertaking a huge bring your own device (BYOD) project at its organisation, which will soon see the firm supporting around 135,000 devices picked by users rather than dictated by the IT department.
At the CA World show in Las Vegas on Monday, Ken Mann, enterprise information security architect at the oil and gas firm, outlined Shell’s shift to become a cloud-first and BYOD outfit.
Shell had already undertaken a project to centralise all its IT, and has outsourced its infrastructure to three main suppliers: AT&T, EDS (since purchased by HP) and T-Systems. Two years ago, the firm adopted a cloud-first policy, which means that any new applications have to be in the cloud unless there is a business case for them to be on-premise.
The next project for Mann’s department was BYOD, which Mann’s boss defines as “buy” rather than “bring” your own device.
The BYOD scheme is a major undertaking. Shell has 90,000 permanent employees and an additional 60,000 on a contract basis, so the company is managing 150,000 clients, from desktops to portables to tablets.
Of those users, 10,000 are already on a BYOD scheme, but Mann said Shell expects that in a few years, less than 10 percent of its users will be using company-provided IT equipment. Put another way, Shell will soon have 135,000 BYOD users to support.
“We’re looking at true BYOD, not just for mobile, but bring in your own laptop,” he said.
“Windows, iOS and Android are key operating systems for us, but if Windows Phone 8 becomes popular, we’ll look into using that.”
Part of the decision for the BYOD drive is around recruitment and staffing.
“In about five to 10 years, 50 percent of our staff worldwide will retire,” Mann explained.
“We’re going to have a lot of people turning over, and we want to be able to attract and retain talented and young staff. They don’t want to come into a locked corporate environment.”
To support this major BYOD drive, Mann’s job was to secure the different devices accessing the corporate network.
“We had two-factor authentication using smartcards and one-time passwords (OTP) as default. But we started to look at how we could do two-factor authentication in the cloud. We wanted a solution for single sign-on from any device, whether in the cloud or an in-house app, and we wanted to support authentication standards like SAML and OAuth and translate between these,” he explained.
“We also wanted device authentication – is it from a Shell device or a kiosk in an airport?”
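The one-time password half of such a scheme is simple to illustrate. A minimal sketch of time-based OTP generation and verification using the pyotp library; this shows the standard TOTP mechanism, not Shell's actual implementation:

    import pyotp

    # Provision a per-user secret once and store it server-side
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    # The user's token generator produces a six-digit code...
    code = totp.now()

    # ...which the authentication service checks as the second factor
    print(totp.verify(code))  # True within the validity window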
Mann said that four IT companies were in the running to provide Shell with its desired cloud authentication system, and each was visited to carry out an on-site proof of concept, with CA being one of the four.
“We didn’t find one company that could do everything we wanted to do. CA showed us the guts and development code, but they didn’t have a solution ready at the time,” he noted.
“Based on the four firms, we ended up selecting CA CloudMinder – it didn’t have a name at the time – as it was highly focused on cloud apps, and we’re already using SiteMinder, which focuses on in-house authentication, so there was a good bridge to link cloud and on-premise apps.”
CA CloudMinder was released in February, and is designed to offer enterprises key security capabilities including advanced authentication, identity management, and federated single sign-on as cloud services.
CA also unveiled a partnership with SAP at the Las Vegas show, to license the latter’s Afaria software for mobile device management.
Posted in Cloud Hosting