Cloud Hosting

Dropbox hits 175 million users as firm predicts death knell for the hard drive

Posted on April 2, 2014 at 11:34 am

Cloud storage firm Dropbox has added 75 million users since October, giving it a total of 175 million users, as it looks to flex its muscles among IT vendors and consumers alike by vowing to replace the hard drive.

The announcement of the new user numbers was made at Dropbox’s first developer conference, called DBX, in California.

At the event, Dropbox announced that it is expanding the API roster already used by developers worldwide. A new Datastore API not only syncs documents and pictures, but also keeps track of app data, such as settings, across platforms — useful for apps such as to-do lists that run on multiple operating systems.
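The core idea behind a datastore-style API — structured records that sync across devices rather than whole files — can be sketched in a few lines. The following is a purely illustrative model, not Dropbox’s actual SDK: the class name, methods and the last-writer-wins merge rule are all assumptions made for the sake of the example.

```python
# Illustrative model of a "datastore"-style sync layer for app data
# (hypothetical API, not Dropbox's actual SDK): records live in named
# tables, and device replicas reconcile on a last-writer-wins basis
# per field, using a shared logical clock to order writes.

import itertools

_clock = itertools.count()  # stands in for real write timestamps

class Datastore:
    def __init__(self):
        # table name -> {record id -> {field: (value, version)}}
        self.tables = {}

    def put(self, table, record_id, **fields):
        version = next(_clock)
        record = self.tables.setdefault(table, {}).setdefault(record_id, {})
        for key, value in fields.items():
            record[key] = (value, version)

    def get(self, table, record_id):
        record = self.tables.get(table, {}).get(record_id, {})
        return {key: value for key, (value, _) in record.items()}

    def merge(self, other):
        # Pull in another replica's records; the newer write wins per field.
        for table, records in other.tables.items():
            for rid, fields in records.items():
                mine = self.tables.setdefault(table, {}).setdefault(rid, {})
                for key, (value, version) in fields.items():
                    if key not in mine or version > mine[key][1]:
                        mine[key] = (value, version)

# Two devices edit the same to-do item; syncing reconciles both changes.
phone, laptop = Datastore(), Datastore()
phone.put("tasks", "t1", title="Buy milk", done=False)
laptop.merge(phone)
laptop.put("tasks", "t1", done=True)                   # ticked off on the laptop
phone.put("tasks", "t1", title="Buy milk and bread")   # renamed on the phone
phone.merge(laptop)
laptop.merge(phone)
```

After the final two merges both replicas converge on the same record, which is exactly the property that lets a to-do app feel consistent across a phone and a laptop.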

A Dropbox blog post explained: “We’ve designed the Dropbox platform to be the best foundation to connect the world’s apps, devices and services. We started with the Sync API, which let developers take advantage of the file syncing technology that took us years to get right.

“Today we’re announcing a suite of tools that fundamentally simplifies how developers can build across devices and platforms.”

Dropbox chief executive Drew Houston added that the demand for the firm’s cloud storage services was removing the need for physical hard drives.

“I don’t mean that you’re going to unscrew your MacBook and find a Dropbox inside, but the spiritual successor to the hard drive is what we’re launching,” he told Wired.

At the end of June, Dropbox also began to shake off its consumer-only reputation, allowing IT vendors to start selling Dropbox products through the Partner Network program. The company already has 150 resellers on its books as it looks to grow this list further.

Dropbox has integrated its services into several major tech brands, including Yahoo Mail, which allows users to send attachments larger than 25MB.

Posted in Cloud Hosting

Citrix enables remote desktop delivery from Windows Azure with XenDesktop 7

Posted on March 31, 2014 at 8:48 pm

Citrix now enables enterprise customers and service providers to deliver Windows desktop sessions from Microsoft’s Windows Azure cloud computing platform using XenDesktop 7, the latest release of its desktop virtualisation system.

Announced today, the ability to run XenDesktop 7 in the cloud gives enterprises greater flexibility in extending capacity on demand, reduces lead-time for procurement, and helps minimise risk as loads fluctuate, according to Citrix. The firm has a long-standing partnership with Microsoft on virtualisation, management, and remote access tools.

The move follows recent licensing changes from Microsoft that allow the use of Windows Server Remote Desktop Services (RDS) provider licenses on Windows Azure. Previously, licensing restrictions prevented Windows Server-based desktops from being hosted on Microsoft’s cloud platform and accessed via RDS. As of 1 July, Citrix XenDesktop 7 and XenApp 6.5 support this scenario in Azure, Citrix said.

According to the firm, demand for cloud-hosted desktops is on the rise in small and medium businesses to meet the need for cost-effective IT services that enable a mobile workforce. By using Azure, service providers can meet these needs while focusing on service delivery rather than managing the underlying infrastructure.

Meanwhile, for enterprise customers, remote desktops hosted on Windows Azure offer the benefit of deployment in an environment familiar to Windows administrators and optimised for Windows workloads, the firm said.

Citrix group vice president for Desktops and Apps Bob Schultz said: “Customers are increasingly adopting cloud hosted session desktops to enable mobile workstyles and simplify their operations. Leveraging the cloud as a deployment platform will further accelerate these major trends.”

XenDesktop 7 was released at Citrix’s Synergy event in May, and integrates the XenApp application delivery platform with its existing virtual desktop capabilities to provide a unified framework for app and desktop delivery with a single management console.

“As a leading desktop virtualisation product, XenDesktop 7 enables users to have a high definition experience across a broad set of user-defined devices. We are continuing to work with Microsoft to allow both IT and service providers to choose Windows Azure for their Citrix deployments,” Schultz said.

Posted in Cloud Hosting

PRISM spy storm could see UK and US cloud computing customers flee overseas

Posted on March 29, 2014 at 8:14 am

Almost a month ago, allegations of widespread NSA surveillance sent shockwaves around the world. Cautious web users started thinking a little more carefully about what they do online, with social networks such as Facebook coming under fire for allegedly providing ‘backdoor’ access for security services. Even the EU had to begin considering holding meetings in the Brussels sunshine as “disturbing” claims emerged accusing the NSA of bugging EU offices and hacking EU computers.

One of the other lasting consequences could be a dwindling trust in secure cloud providers. Neelie Kroes’ EU speech warning of multi-billion euro consequences for cloud providers highlighted a very interesting point. In the wake of this scandal, there is an enormous opportunity for IT vendors to sell beefed-up secure servers to clients who hold sensitive information.

Kroes’ extensive work on the EU’s Digital Agenda puts her right up there as one of the most respected technology speakers in Europe, and her words on the cost of the scandal carry a lot of weight. “If businesses or governments think they might be spied on, they will have less reason to trust the cloud, and it will be cloud providers who ultimately miss out,” she said. “In this case it is often American providers that will miss out, because they are often the leaders in cloud services.”

Indeed, European corporations have always eyed US-based cloud companies with a little suspicion; taking all of your data and outsourcing it to a huge cloud corporation does rather take away control, something many CIOs do not enjoy. 

But it’s not just providers across the pond who are going to suffer. In a discussion in the House of Commons last week, former shadow home secretary David Davis suggested that countries with more stringent privacy laws, such as Germany, would see a benefit when it comes to data hosting, while countries such as the UK, where data spying concerns now exist, would see interest disappear. He also made very clear that he thinks current UK law is “completely useless” for protecting UK citizens.

Countries outside the EU also reportedly benefit. Switzerland, for example, is well known for its offshore-style private banking system, and that attitude is reflected in its IT sector. Outside the EU, pan-European agreements on data sharing do not apply, with data only accessible once liability or guilt is proven.

Put simply, in the same way that anonymous search engines such as DuckDuckGo are rapidly gaining popularity, cloud solutions based in countries with stronger privacy laws seem to be gathering momentum.

One such company benefitting from a heightened level of paranoia is Artmotion, which touts itself as Switzerland’s biggest offshore cloud provider. The company saw a 45 percent rise in demand for secure cloud services in the wake of the PRISM scandal, and while correlation does not mean causation, it certainly makes for interesting reading.

With clients ranging from tobacco to tech and from oil to security firms, security and data privacy sit right at the top of the priority list. Mateo Meier, the company’s chief executive, isn’t surprised by the rise in demand for secure cloud systems outside the EU.

“We have many clients saying they don’t know what the government is doing, and those companies usually pay very little attention to the details of privacy,” he explained. A staggering 80 percent of companies that host with Artmotion turned to it after a security breach, suggesting that secure hosting is more often damage control than an initial priority.

In particular, Meier has noticed a large influx of UK-based companies requesting his services; he estimates that in recent weeks roughly 25 percent of calls have come from UK numbers.

“We fight for protection and honest business, and I think that’s going to continue for the next few years,” explains Meier. “Privacy is important but it will be more so in five or ten years. Money is replaceable but data is not.”

The only problem is that the existence of laws doesn’t mean they won’t be broken. If proven true, the spying and wiretapping alleged over the last month comes very close to the edge of the law, if not rather over it. The fact that strict privacy laws exist in Germany and Switzerland does not mean that government teams won’t give it a go; it’s just that the diplomatic consequences would be rather stronger with those laws in place.

If the revelations of unabated data access continue, it’s entirely feasible that cloud vendors in countries without strict data laws could see their customer base dented. It just depends on how many more nasty surprises there are to come.

Posted in Cloud Hosting

PRISM: EU warns of serious consequences for cloud computing vendors in the wake of spy scandal

Posted on March 27, 2014 at 8:15 pm

Neelie Kroes has warned of “multi-billion euro consequences” for cloud service providers if customers can no longer trust their security measures in the wake of the PRISM hacking scandal.

“If businesses or governments think they might be spied on, they will have less reason to trust the cloud, and it will be cloud providers who ultimately miss out,” said Kroes, the vice president of the European Commission (EC) who speaks regularly on digital issues.

She said that customers who allow their cloud suppliers to hold sensitive information will find themselves in a difficult position: “Why would you pay someone else to hold your commercial or other secrets, if you suspect or know they are being shared against your wishes?” she asked.

Customers would see sense and look elsewhere, according to Kroes, with US vendors bearing the brunt of the damage: “Front or back door – it doesn’t matter – any smart person doesn’t want the information shared at all,” she said. “Customers will act rationally, and providers will miss out on a great opportunity.

“In this case it is often American providers that will miss out, because they are often the leaders in cloud services.”

But she said that while privacy is a “fundamental right”, it should not be down to policy makers to produce legislation forcing cloud providers to put stronger security measures in place. Instead, she said that in the interest of an open and competitive market, cloud vendors should put their own security measures in place as they see fit, adding: “Privacy is not only a fundamental right, it can also be a competitive advantage.”

“Companies focused on privacy need to start coming forward into the light… 2013 is the year,” she concluded.

This week, allegations emerged suggesting EU buildings were bugged and EU computer equipment was hacked, with the EC labelling the incidents “deeply disturbing”.

Posted in Cloud Hosting

Microsoft previews Windows Server 2012 R2 Essentials with cloud and virtualisation enhancements

Posted on March 25, 2014 at 11:47 am

Microsoft has announced a preview of Windows Server 2012 R2 Essentials, the upcoming version of its server line targeted at SMEs.

While the new version continues to focus on offering value for smaller businesses, Microsoft said it has added capabilities to allow it to scale to larger deployments, while making it easier for customers to take advantage of virtualisation and cloud services.

Microsoft unveiled previews for its mainstream Windows Server 2012 R2 releases at its TechEd conference last month, but the Essentials R2 preview has only just been pushed out.

Essentials itself was only officially released last October, replacing the former Windows Small Business Server (SBS) product as Microsoft’s small business offering.

New features include tools to make it easier for small to medium enterprises (SMEs) to use virtualisation and cloud computing resources, and a revamped Dashboard to keep management as simple as possible, Microsoft said.

Writing on Microsoft’s blog, Jason Anderson, group programme manager for the Windows Server Essentials team, said that Windows Server 2012 R2 Essentials continues to be the best option for small businesses that need to have a physical server.

“One of the reasons SMBs are hesitant to move to virtualisation has to do with the complexity of properly configuring and managing virtual instances of the operating system,” he said.

To address this, Essentials R2 makes it possible to host a guest instance of itself that can be set up manually or via a simple wizard to guide customers through the setup process.

Likewise, Essentials R2 has configuration options available to deliver an optimised experience in a hosted deployment, where features such as client backup and storage spaces are turned off by default.

These configuration settings are designed to help customers deploy both on Microsoft’s Windows Azure cloud and in private clouds; Essentials R2 is also designed to work with the recently announced Windows Azure Pack, for example.
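The profile-based defaults Microsoft describes — the same product shipping with different features pre-enabled depending on where it is deployed — can be modelled simply. This is a hypothetical sketch; the feature flags and profile names below are invented for illustration, not Essentials R2’s actual configuration schema.

```python
# Hypothetical sketch of deployment-profile defaults: a hosted deployment
# starts with client backup and storage spaces disabled, while an
# on-premises install keeps them on. Names are illustrative only.

BASE_FEATURES = {"client_backup": True, "storage_spaces": True, "dashboard": True}

PROFILE_OVERRIDES = {
    "on_premises": {},                                        # keep all defaults
    "hosted": {"client_backup": False, "storage_spaces": False},
}

def features_for(profile):
    """Return the effective feature set for a deployment profile."""
    config = dict(BASE_FEATURES)          # start from the base defaults
    config.update(PROFILE_OVERRIDES[profile])  # apply profile overrides
    return config
```

The design point is that overrides are layered on top of one shared baseline, so a customer can still re-enable any feature after setup.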

The Dashboard in Essentials R2 has also been enhanced with a services integration status page that shows the services that are available, and whether the customer’s server is attached to these services.

New services include support for Windows Azure Active Directory, providing SMEs with single sign-on and user authentication integration between their local Active Directory and cloud services such as Office 365.

Essentials R2 also integrates with Microsoft’s Windows Intune cloud-based management service, enabling an administrator to manage users, security groups, and licenses for Windows Intune services from within the Dashboard. Microsoft is also making it possible for larger firms to deploy Essentials R2 as a server role within their data centre.

“Many of the key features that have been in the Windows Server Essentials product to date have not been available in [other] Windows Server editions. We have consistently heard from many of our customers in larger organisations that they too want the value that Windows Server Essentials offers,” said Anderson.

Microsoft stresses that Windows Server 2012 R2 Essentials is still under development, and that this is just a preview release. Like the existing version, it is limited to 25 users, with minimum system requirements of a 1.4GHz 64-bit CPU, 2GB of memory and a 160GB hard drive.

Posted in Cloud Hosting

Oracle talks up cloud computing abilities of Database 12c

Posted on March 23, 2014 at 8:50 pm

Oracle has released an update to its flagship Database offering, which the company hopes will help the platform become a cloud computing staple.

The company said that the Database 12c release would introduce a multi-tenant architecture which will be key for transitioning the platform to hosted services.

The introduction of multi-tenancy is designed to let administrators better manage multiple database instances on a single server, effectively allowing service providers to offer hosted database services to small and medium-sized businesses, while large enterprises will be able to consolidate their data centres with private cloud deployments.
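The consolidation idea — one container hosting many logically isolated tenant databases that share server resources — can be illustrated with a toy model. To be clear, this is not Oracle’s architecture or API; the class, method names and capacity limit are all invented for the sketch.

```python
# Toy model of multi-tenant database consolidation: a single container
# process hosts several isolated "pluggable" tenant databases, so a
# provider can serve many small customers from one server. Purely
# illustrative; not Oracle's actual design.

class ContainerDB:
    def __init__(self, max_tenants):
        self.max_tenants = max_tenants
        self.tenants = {}           # tenant name -> isolated keyspace

    def plug_in(self, name):
        if len(self.tenants) >= self.max_tenants:
            raise RuntimeError("container at capacity")
        self.tenants[name] = {}     # each tenant sees only its own data

    def write(self, tenant, key, value):
        self.tenants[tenant][key] = value

    def read(self, tenant, key):
        return self.tenants[tenant][key]

cdb = ContainerDB(max_tenants=4)
for customer in ("acme", "globex"):
    cdb.plug_in(customer)
cdb.write("acme", "orders", 12)
cdb.write("globex", "orders", 7)    # same key, isolated per tenant
```

The point of the model is that tenants share one server process (cheap to run) while their keyspaces never overlap (safe to share).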

“It is a really significant release,” said Duncan Harvey, Oracle EMEA director of business development technology. “It is a much more cloud-friendly and modern architecture, which is going to allow a lot more consolidation.”

Cloud computing has been a hot topic for Oracle in recent weeks. The company has signed a pair of high-profile deals, including an agreement with Microsoft to offer its platforms on the Azure cloud service and the renewal of a deal with Salesforce.com to power the company’s CRM service with Database.

In making the cloud deals, Oracle will also settle what have at times been strained relationships with its rivals that included public feuds and critical remarks towards executives.

While the multi-tenant architecture is the major selling point for Database 12c, Harvey told V3 that the update will include a number of additional features. He noted that the security tools in 12c have been updated to better protect user data and prevent the loss of vital information, while tools such as Automake can help to conserve storage space.

“The actual Oracle Database itself can look at data usage,” he explained. “It can look at what is hot and what is cold at various levels and, based on that and policies, move the data into multiple tiers of storage.”
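The policy Harvey describes — score data by how recently it was used, then migrate it between tiers — reduces to a simple classification rule. The following is a minimal sketch under assumed thresholds; the tier names and age cut-offs are invented, not Oracle’s actual settings.

```python
# Minimal sketch of policy-based storage tiering: recently accessed
# ("hot") data stays on fast storage, cold data moves down the tiers.
# Thresholds and tier names are illustrative assumptions.

def assign_tier(days_since_access, policy=((7, "flash"), (90, "disk"))):
    """Return the storage tier a dataset belongs on under the policy."""
    for max_age_days, tier in policy:
        if days_since_access <= max_age_days:
            return tier
    return "archive"  # anything older than every threshold

def rebalance(datasets):
    """Map {dataset name: days since last access} to target tiers."""
    return {name: assign_tier(age) for name, age in datasets.items()}
```

A call such as `rebalance({"q1_sales": 2, "2010_logs": 400, "catalog": 30})` would place the live sales data on flash, the catalogue on disk, and the old logs in the archive tier — which is exactly the space-saving effect the article attributes to the feature.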

Posted in Cloud Hosting

Wait for iOS 7 helps push down global IT spend forecasts by $100bn

Posted on March 21, 2014 at 11:53 am

Worldwide IT spending forecasts for 2013 have been lowered by $100bn to $3.7 trillion, after analyst house Gartner revised its figures for the year.

The firm said the revised figure was due to several factors in the market including a continued slump in device spending and fluctuating currencies. Despite this, the figure of $3.7 trillion is still up on the worldwide IT spend of $3.6 trillion in 2012.

The slowdown in devices is seen in several areas, including traditional desktop computers and laptops, with sales of devices set to rise by just 2.8 percent in 2013 to just shy of $700bn. This contrasts with growth of 10.9 percent in 2012, when device sales hit $676bn.

Gartner analyst Richard Gordon told V3 this is due to a number of factors, such as a lack of must-have products this year, as well as the price of alternative products and consumer indecision.

“2012 was very strong for devices overall, but this was helped by the premium smartphone market,” he said. “Now, people are holding on to their smartphones longer, waiting for new products and operating system upgrades [such as iOS 7], which essentially give them a new phone with the same hardware. In the tablet space new purchasers are going for low-end tablets because they’re pretty good and do the job.”

However, better growth is expected in the data centre market as cloud computing and big data push up spending, with $140bn now expected to be spent on servers this year. Gordon said this is down to an ever-growing appetite for data storage and bigger projects.

“The storage space has been relatively recession proof as the amount of data that needs to be stored is ever increasing,” he said. “There is a trend towards hyper-scale data centres; work can be more efficiently managed in those data centres. Although we’re seeing growth, it’s modest.”

There was also good news for the IT services sector: projected spending is almost at $1 trillion for 2014, as a result of many more complex systems being employed by businesses. In addition, the increase in spending on IT services represents a recovering economy, according to Gordon.

Telecoms still sees the highest spending overall, with $1.6 trillion to be spent on communications services this year. The market shrank slightly in 2012, but in 2013 the sector will return to growth, thanks in some part to the rapid rise of mobile data.

“Fixed voice and mobile voice are declining, and spending is shifting towards mobile data because of the popularity of mobile devices and mobile apps requiring a heavy amount of data use. In 2012 mobile data was about 23 percent of overall spending; by 2017, it will be 36 percent,” added Gordon.
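Gordon’s percentages imply sizeable absolute figures. As a back-of-the-envelope check — assuming, purely for illustration, that total communications-services spend stays near the $1.6 trillion cited for 2013 — the shares work out as follows:

```python
# Rough arithmetic on the quoted mobile-data shares. The flat $1.6tn
# total is an assumption for illustration, not a Gartner forecast.

total_spend = 1.6e12              # communications services, USD
share_2012, share_2017 = 0.23, 0.36

mobile_data_2012 = share_2012 * total_spend
mobile_data_2017 = share_2017 * total_spend

print(f"2012: ${mobile_data_2012 / 1e9:.0f}bn -> 2017: ${mobile_data_2017 / 1e9:.0f}bn")
```

On those assumptions the shift would move mobile data from roughly $370bn to well over $570bn a year; the real 2017 figure would also depend on how the overall market grows.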

Posted in Cloud Hosting

Rackspace joins Cern openlab to develop federated clouds for LHC research

Posted on March 17, 2014 at 12:40 pm

European physics laboratory Cern is working with Rackspace on a project to link together multiple clouds used by the site along with other research centres and public cloud computing resources, in order to provide scientists with the compute power to research data from the Large Hadron Collider (LHC).

Unveiled today, the partnership sees Rackspace join the Cern openlab public/private collaboration scheme as part of a research project into linking together multiple clouds. The move is expected to provide Cern scientists with huge amounts of compute power to research results from the LHC, while Rackspace hopes to gain valuable experience and insight into best practices around cloud interoperability and managing large-scale cloud infrastructure.

Tim Bell, infrastructure manager at Cern, told V3 that the simulations that scientists want to run sometimes exceed the capacity available at the centre, and so the project will seek ways of expanding compute capacity by linking to clouds at other research facilities and public Openstack clouds.

“We get around 35 petabytes (PB) a year from the LHC when it’s running, and this data is analysed and recorded using the set of machines hosted at Cern, plus an additional datacentre we’re just setting up at Budapest. We’re in the process now of converting what was a set of physical servers into a large-scale Openstack-based cloud,” Bell explained.

“We expect by the time the [LHC] starts up operations again in 2015 to have around 15,000 servers across those two datacentres to handle the data recording. On top of that, the scientists need to be able to simulate collisions, to visualise what the theory would predict and then analyse the results of the data to compare what real life is looking like against the theory.

“Typically, we would use resources that we have here at Cern, but if there is a conference coming up, we need to run additional programs to analyse the latest data and the workload often exceeds the capacity that we have available. That’s why it is interesting to be looking at being able to take advantage of public cloud resources without having to permanently enlarge the datacentre here,” he said.
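The pattern Bell describes is often called cloud bursting: fill local capacity first, then overflow peak workloads to federated public resources. A minimal sketch of that scheduling decision, with made-up job sizes and capacities, might look like this:

```python
# Sketch of the "bursting" pattern: place jobs on local capacity until
# it is full, then overflow the rest to public cloud resources.
# Capacities and job sizes are invented for illustration.

def schedule(jobs, local_capacity):
    """Split (name, cores) jobs into locally run and burst-to-cloud lists."""
    local, burst, used = [], [], 0
    for name, cores in jobs:
        if used + cores <= local_capacity:
            local.append(name)
            used += cores
        else:
            burst.append(name)   # runs on federated public capacity instead
    return local, burst

# A pre-conference spike exceeds the on-site capacity, so the extra
# analysis run overflows to the public cloud.
jobs = [("sim-a", 6000), ("sim-b", 8000), ("conference-rerun", 5000)]
local, burst = schedule(jobs, local_capacity=15000)
```

Real federation is far harder than this first-fit split — data locality, identity, and image portability all matter — which is precisely what the Cern/Rackspace project sets out to explore.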

Nigel Beighton, vice president of technology at Rackspace, told V3 that his firm is part-funding the project, which is expected to pay dividends in new standards for linking clouds together.

“Cern is one of the world’s largest producers of data. They need large banks of compute power to deal with the amount of data they’ve got. We’re going to be looking at the best technological solutions to do that across multiple clouds,” he explained.

“The best outcome at the end of the day will be if two things come out of this: one is the technology to allow people to connect their clouds together, and second, given Cern’s heritage in open standards, there emerges a set of open standards for broader interoperability of clouds,” Beighton said.

Cern is now operating three Openstack clouds, according to Bell. In addition to the one operated by the site’s IT department, there are two large server farms associated with the CMS and ATLAS experiments, normally used to filter the 1PB of data per second that spews out of the detectors down to a reasonable volume that can be recorded.

While the accelerator is being upgraded over the next 18 months, these two server farms, comprising about 1,300 servers in total, are being spun up with an Openstack cloud each to provide extra resources.

“There are also other locations around the globe affiliated with Cern and running Openstack, such as Brookhaven National Laboratory and Nectar Labs in Australia, so it gets very interesting to look at how these resources can be connected and do better sharing. It is both a private to private and private to public federation that we are envisaging,” Bell said.

Cern is also looking into using tools such as Hadoop in order to make use of the techniques that others are using for large-scale data analytics, according to Bell.

Meanwhile, Cern also has some impressive IT infrastructure just to control and operate the LHC, as V3 reported last year.

Posted in Cloud Hosting

Benioff and Ellison promise to play nice in Oracle-Salesforce tie-up

Posted on March 15, 2014 at 6:34 pm

Oracle chief executive Larry Ellison and Salesforce.com chief executive Marc Benioff vowed to work together as the two firms kicked off a new partnership deal.

The long-time associates and rivals discussed their recently-announced collaboration effort with reporters and said that the two companies would not only integrate their products, but also potentially collaborate on additional efforts.

The early phases of the project will see Salesforce double down on its efforts to build its back-end infrastructure on Oracle Database. Benioff said that the Oracle platform, which has powered Salesforce.com since the company first launched, will continue to provide the basis for its CRM and platform as a service (PaaS) offerings.

“The Oracle Database has been a key part of Salesforce.com infrastructure from the beginning of our company,” he said.

“Now that Oracle has focused on the cloud, we are delighted to commit to another 12 years.”

Relations between the two companies have not always been so rosy. Driven by an escalating competition in the enterprise computing space, Ellison and Benioff have famously butted heads on several occasions, including a heated confrontation at the 2011 OpenWorld conference in which both CEOs publicly criticised one another.

To help make peace, Benioff extended an open invitation to Ellison to attend the Salesforce.com Dreamforce conference this autumn, a gesture Ellison was quick to accept.

“We have always enjoyed working together and having fun,” Benioff said of the relationship with his former boss.

“Hopefully it will be the end of us getting too revved up at times.”

While neither CEO would provide details on future planned projects, Ellison suggested that Oracle could look to further integrate Java with the Force.com platform.

“This is an area where Salesforce.com and Oracle can explore,” he said, “and if it makes sense we will have another announcement and make all the developers happy.”

Posted in Cloud Hosting
