Posted on April 7, 2013 at 6:17 pm
Salesforce has launched a mobile edition of its Service Cloud application with features including chat, co-browsing, communities and touch interfaces for both iOS and Android devices, including the Kindle.
Service Cloud Mobile is designed to help businesses improve the mobile services they offer, letting them interact with customers in real time and answer queries instantly.
“We are doubling down on mobile this year,” Salesforce innovation director Charlie Richey told V3. “This announcement is the launch pad [for] a number of mobile announcements that will follow.
“This isn’t just saying Service Cloud is mobile as we have had mobile capabilities on the Service Cloud before. This is about customer satisfaction and improving customer experiences.”
“Everyone looks first to receive customer service on the mobile devices they carry around with them. We believe they should not have to be on hold, and should not have to look for answers on social networks.”
The co-browsing technology available with Service Cloud Mobile will allow customer service agents to deliver guided assistance to customers on mobile devices via any web browser. Customers may need such assistance when performing complex transactions or trying to set up accounts or resolve issues.
Meanwhile, Service Cloud Mobile's chat feature will allow customers to interact instantly with live service agents to quickly resolve issues as they happen, said Salesforce.
A communities feature will allow businesses to provide a single destination for customers needing answers to questions via peers or company experts.
Additionally, Salesforce said Service Cloud Touch will allow service agents to easily manage and resolve customer cases on the go with an Amazon Kindle, Android device, iPad or iPhone.
The cloud giant said the release builds on the success of Salesforce Touch, which brings Salesforce Sales Cloud to mobile devices, and the Salesforce Touch Platform, which allows developers to write custom mobile applications once and then deploy them to any device.
One customer that has used Service Cloud Mobile in beta is New Jersey’s public transportation organisation NJ Transit. It said it is now able to give travellers real-time information about their journeys.
However, Richey told V3 that Salesforce had no UK customers using Service Cloud Mobile in beta that he could reference.
Posted in Cloud Hosting
Posted on April 5, 2013 at 4:37 pm
Samsung has unveiled a new service called Knox to let enterprises create and manage a secure container for corporate applications and data on their Android smartphones.
The handset maker last year introduced Samsung for Enterprise (Safe) on its Galaxy S3 smartphone, adding capabilities such as encryption and VPN support in a bid to make its devices more appealing to the corporate market.
Now, Samsung is extending this with the Knox technology, a move to meet the growing bring your own device trend by creating separate container environments for personal and professional use on the same device.
The firm said this should remove many of the headaches facing firms dealing with issues of security and data protection. The Knox service will be available on select devices later this year.
Security management vendor Centrify has partnered with Samsung to provide administrator control for the feature via a company’s existing Active Directory infrastructure.
The capability is said to be comparable to the Balance feature on BlackBerry’s new Z10 and Q10 smartphones, potentially making Samsung a rival for BlackBerry in the corporate market.
“It involves making their devices more secure and enterprise-ready, providing cleaner separation between work and play, and Samsung definitely sees an opportunity to go after the enterprise mantle that BlackBerry has historically had,” Centrify chief executive Tom Kemp told V3.
The technology revolves around creating an isolated container on the device, according to Kemp, which can be managed by the IT department, while the user is free to use the handset outside the container for whatever they wish.
“One of the issues with bring your own device (BYOD) in the enterprise is that there’s a big concern about data leakage occurring, and making sure the enterprise stuff stays in the sandbox and doesn’t leak out,” Kemp said.
“If the employee leaves the organisation, IT want to ensure the information is wiped, but the user doesn’t want their personal photos and music deleted as well.”
Posted in Cloud Hosting
Posted on April 3, 2013 at 3:46 pm
Google has unveiled a new Chromebook device called the Pixel that is designed to compete with Apple and Microsoft at the top end of the laptop market with a cool £1,049 UK price tag.
The device was unveiled late on Thursday and V3 was one of a handful of UK sites to get its hands on it. Below is a series of images showing the key dimensions and specifications of the device.
The device boasts a 3:2 display, which Google said is designed to better display web pages, which are generally laid out vertically, rather than adhering to the more horizontally framed 16:9 movie format.
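For a sense of what that difference means in practice, here is a quick, purely illustrative Python calculation (not a figure quoted by Google or V3) comparing how much vertical space a 3:2 panel offers versus a 16:9 panel with the same diagonal.

```python
# How much taller is a 3:2 panel than a 16:9 panel of the same diagonal?
# Purely illustrative arithmetic; not figures quoted by Google or V3.
from math import hypot

def height_fraction(width_ratio: float, height_ratio: float) -> float:
    """Fraction of the diagonal accounted for by the panel's height."""
    return height_ratio / hypot(width_ratio, height_ratio)

extra = height_fraction(3, 2) / height_fraction(16, 9) - 1
print(f"A 3:2 panel is roughly {extra * 100:.0f}% taller than a 16:9 panel "
      f"of the same diagonal")  # works out at around 13%
```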
Google said it did away with the markings usually found next to ports, both for design reasons and because most users never actually consult them, instead simply working out which plugs fit into which holes. This makes sense in a way: most people don't need the symbols, and many other devices don't include them either.
Google made big boasts about the screen, claiming it was better than anything on the market with a pixel density of 229 pixels per inch (PPI). Here you can see it snapped against a large-screen MacBook Pro, and certainly there's not much difference between the two, as you can see below (although admittedly, these are far from ideal testing conditions, and we'll compare the screens more thoroughly in due course).
When compared with Windows devices on the market such as the Lenovo X220, the Pixel is not that much larger overall, but far more of its real estate is given over to the screen and keyboard, which is a nice touch as these matter far more than the casing around them.
Speaking of the keyboard, it's certainly very nice, with deft, responsive keys that are well spaced out and easy to adapt to.
Overall, based on early first impressions, while the price is fairly eye-watering, the Pixel is a lovely piece of design and the touchscreen is very nice to use too. Those not sold on Apple or Microsoft, looking for something at the high end and confident they'll always have Wi-Fi access may well be tempted, but the price is likely to prove off-putting for many.
Check back in the coming days for a more thorough review.
Posted in Cloud Hosting
Posted on April 1, 2013 at 9:33 am
The VCE datacentre joint venture formed by Cisco and EMC has unveiled the next wave of its converged infrastructure products, introducing lower-priced mid-range Vblock integrated systems, enhanced management tools and specialised systems for specific applications starting with SAP HANA.
VCE announced the Vblock System 100 for branch office deployments and the Vblock System 200 for mid-sized customers, positioning both of these below the existing Vblock System 300 and 700 series of modules.
Meanwhile, the latter lines have been enhanced with the addition of the Vblock System 320 and 720, which offer three times the performance and three times the scalability of the existing hardware, according to Todd Pavone, VCE’s vice president of product development and strategy.
“Previously, with our enterprise customers, we were selling into the core, but now with the Vblock 100, you can go to the edge, to those remote office locations, while with Vblock 200, we can say to mid-size firms that you can take advantage of the benefits the large enterprise customers have been getting,” he said.
VCE’s raison d’être is to offer converged infrastructure to enterprise and cloud computing service providers, combining Cisco network switches and UCS blade servers with EMC VNX storage in pre-built and pre-tested configurations, delivered in racks as modular Vblocks.
Pavone claimed that figures from IDC demonstrate that this architecture delivers five times faster deployment and 83 times better availability, and costs three times less than existing siloed infrastructure.
VCE also introduced Vblock Specialised Systems, versions of the infrastructure optimised for key enterprise applications, with SAP HANA the first to be delivered.
“We take the app and integrate it for you, certified by SAP, so it is uniquely optimised to architect that workload into Vblock,” said Pavone.
Specialised Systems integrated with other enterprise applications are planned over the next 12 to 18 months, he added.
To simplify management, VCE announced Vision Intelligent Operations, a single interface for managing the entire portfolio of Vblock systems and their components, delivering an administrator dashboard for VCE infrastructure.
Vision integrates fully with VMware's vCenter and vCenter Operations Management Suite, Pavone said, and VCE also offers an open API and SDK to enable integration with other management platforms if required.
“We believe this announcement today represents the foundation for the future, for that end-state datacentre that is predictive, intelligent and adaptive,” Pavone said.
Vblock System 100 is due to ship in March via VCE and its partners, while Vblock System 200, VCE Vision Intelligent Operations and Vblock Specialised System SAP HANA are all expected early in the second quarter.
Posted in Cloud Hosting
Posted on March 30, 2013 at 9:18 am
VMware’s Horizon Suite, announced this week, is the culmination of multiple projects the firm has had bubbling away in the background for several years, and extends beyond its traditional stronghold in the corporate datacentre to address client-side issues, providing access to applications and data from a broad variety of platforms.
In fact, when you look at the breadth of capabilities baked into the suite as a whole, Horizon can be seen as an ambitious attempt by VMware to position itself as an all-encompassing enterprise computing provider, with the potential for a monumental lock-in for users.
VMware is offering organisations the opportunity to not only consolidate all of their servers and back-end infrastructure into a private cloud, but to then use that platform to deliver end-user computing as a set of centralised services out to mobile devices and Windows PCs alike.
The suite’s three components – View, Mirage and Workspace – provide virtual desktop, physical PC management and access to applications and data, respectively. Although available separately, the three work together to provide a wide-ranging feature set for handling end-user computing.
Horizon Workspace is the newest and possibly most significant piece, as it integrates the Dropbox-like Project Octopus technology for shared documents with Horizon Application Manager, which unifies management of Windows and SaaS applications into a central catalogue from which users can access them.
On non-Windows devices, access to Windows apps can be delivered by remotely accessing a virtual desktop, provided by Horizon View. This works even on devices without a View client, by delivering the virtual desktop inside a browser.
For firms that bought into VMware’s platform some time ago, this must seem like a tempting proposition. For a relatively modest additional cost, they can utilise the infrastructure which they have already invested in to help them solve problems such as bring your own device (BYOD), ensuring that users can have access to the applications they need from anywhere, and securely sharing company documents.
However, that is also where the drawback lies. For all the attractiveness of being able to deliver end-user computing as a centrally provisioned service, you need VMware’s underlying vSphere platform in order to support this, and that doesn’t come cheap.
Posted in Cloud Hosting
Posted on March 28, 2013 at 9:22 am
Western Europe is set for a data explosion that will make the region among the top content consumers on the planet, according to a report from IDC.
The research firm said that by 2020, users in Western Europe will consume some 5.5 zettabytes of data annually. The figure would represent an annual growth rate of 30 percent and would put the market among the highest consuming regions in the world.
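As a rough check on those figures, the implied starting point can be back-calculated: a 30 per cent compound annual growth rate over the eight years from 2012 to 2020 (the window is our assumption, based on the 2012 figures cited below) multiplies the starting volume by roughly a factor of eight. The short, purely illustrative Python sketch below shows the arithmetic.

```python
# Back-of-the-envelope check on IDC's Western Europe projection.
# Assumption: consumption compounds at 30% a year from a 2012 baseline to 2020.
TARGET_2020_ZB = 5.5      # zettabytes consumed annually by 2020 (IDC projection)
ANNUAL_GROWTH = 0.30      # growth rate cited in the report
YEARS = 2020 - 2012       # assumed eight-year window

growth_factor = (1 + ANNUAL_GROWTH) ** YEARS       # about 8.2x over eight years
implied_2012_zb = TARGET_2020_ZB / growth_factor   # about 0.67 ZB in 2012

print(f"Growth factor over {YEARS} years: {growth_factor:.1f}x")
print(f"Implied 2012 consumption: {implied_2012_zb:.2f} ZB")
```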
The study, which was sponsored by storage firm EMC, noted that enterprises look set to bear increasing responsibility for the data explosion. Researchers estimate that in 2012 at least 80 per cent of all data generated was the direct or indirect responsibility of enterprises, whether through contact with other firms or direct interaction with customers.
Researchers also noted that Europe has several unique traits that separate it from other regions. Traffic from infrastructure such as video surveillance networks was far higher than in the US or Asia, while individual TV consumption is higher than in other regions, particularly as carriers migrate to digital TV broadcasts.
Going forward, the firm said that it believes emerging fields such as big data and cloud computing will take over as key drivers for data volume growth. As such, IDC said that new challenges will likely arise for enterprises and storage vendors alike.
“In just five years, the Western Europe share of the digital universe will be about the same size as the entire digital universe in 2012. Its share of the digital universe will be many times more valuable than today, but also many times more volatile,” the company said in its report.
“Many times more bytes will need information security, many more systems will need real-time responses, and many more demands for reliability and speedy access will be made of the IT managers, CIOs, data scientists, and chief security officers that manage the actual digital universe.”
Posted in Cloud Hosting
Posted on March 26, 2013 at 10:23 am
VMware is looking to build upon its datacentre presence with the launch of Horizon Suite.
This is a set of technologies that provide access to applications and data from any endpoint device, helping enterprises to deal with the growing bring your own device (BYOD) trend while also encompassing existing investments in Windows PCs.
Set to be available by the end of March, Horizon Suite finally delivers on the end-user computing technologies that VMware has been developing over the past few years, enabling firms with VMware’s vSphere infrastructure to have a centralised, universal platform for delivering applications – including a Windows desktop experience – to any user on any device.
Many aspects of the Horizon technology were previewed at the firm’s VMworld conference last year.
“The one-size-fits-all view of end-user computing has just not been able to keep up with what’s happening in the industry,” VMware’s chief market technologist for end-user computing, Brian Gammage, told V3.
“Our approach with Horizon is to turn assets at the back-end into services which can be accessed via a central point, where you can apply policy, so we can deliver the services to any device and any user,” he added.
Horizon Suite comprises three components which can be used and licensed separately or together as a single suite. Two of these – Horizon View 5.2 and Horizon Mirage 4.0 – are updates to VMware’s existing tools for virtual desktop infrastructure and for managing images with physical PC fleets, respectively.
The third piece is Horizon Workspace, a new product that serves as a user portal for access to document storage, applications and virtual desktops, all via a browser and offering single sign-on for access to all resources.
Key for the enterprise is that Horizon is aware of what kind of endpoint a user is using, and can serve up an appropriate experience for that device’s capabilities, whether it is a PC, a Mac, an iPad or an Android smartphone.
For example, Horizon View now incorporates VMware’s AppShift technology, which touch-enables a remote Windows desktop when accessed on a tablet device.
Likewise, View now includes AppBlast, which serves up a Windows desktop using HTML5, so that a user can access a virtual desktop session from any device that has a web browser, removing the need to install a remote desktop client.
Posted in Cloud Hosting
Posted on March 24, 2013 at 10:10 am
Japanese telecoms giant NTT Communications has announced the global availability of its Enterprise Cloud, a virtualised infrastructure as a service (IaaS) offering with datacentres now in the US, the UK and Asia Pacific.
NTT Communications first launched its software-defined networking (SDN)-based Enterprise Cloud offering via datacentres in Japan and Hong Kong in June 2012. The additional datacentres will now allow customers worldwide to take advantage of the offering.
Len Padilla, NTT Europe senior technology director, told V3 that the Enterprise Cloud will be delivered as a public cloud, rather than as an offering that lets customers own a private cloud in NTT Communications' datacentres.
“[The Enterprise Cloud] is a shared platform, so we are not deploying individual devices per customer, but they can configure the available network. So they can segment the network to get the security and availability they want with a lot of flexibility as well,” Padilla told V3.
“Customers, instead of reconfiguring machines on a flat network, will be able to create a multi-tier network, creating eight or nine individual segments. Then they will be able to insert firewalls in-between those segments and load balance over those segments.”
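To make the multi-tier idea more concrete, here is a minimal, purely illustrative Python sketch of the kind of model Padilla describes: named segments assigned to tiers, with firewall rules permitted only between adjacent tiers. The class and method names are invented for illustration and are not NTT Communications' portal or API.

```python
# Illustrative model of a multi-tier network: segments with firewalls between tiers.
# All names here are hypothetical; this is not NTT Communications' API.
from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    tier: int  # e.g. 0 = web, 1 = application, 2 = database

@dataclass
class FirewallRule:
    source: Segment
    destination: Segment
    port: int

@dataclass
class VirtualNetwork:
    segments: list = field(default_factory=list)
    rules: list = field(default_factory=list)

    def add_segment(self, name: str, tier: int) -> Segment:
        seg = Segment(name, tier)
        self.segments.append(seg)
        return seg

    def allow(self, source: Segment, destination: Segment, port: int) -> None:
        # Only permit traffic between adjacent tiers, mimicking a tiered design.
        if abs(source.tier - destination.tier) != 1:
            raise ValueError("traffic only allowed between adjacent tiers")
        self.rules.append(FirewallRule(source, destination, port))

net = VirtualNetwork()
web = net.add_segment("web", tier=0)
app = net.add_segment("app", tier=1)
db = net.add_segment("db", tier=2)
net.allow(web, app, port=8080)   # web tier may reach the app tier
net.allow(app, db, port=5432)    # app tier may reach the database tier
```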
An SDN-driven portal will give customers a single dashboard to obtain real-time resource use and to configure virtual machines, firewalls and load balancers.
The NTT datacentre in the UK is based in Hemel Hempstead, north of London. In addition to five datacentres in Asia, the firm has deployed datacentres on the US East Coast and West Coast, as well as one in Sydney, Australia. Customers will be able to specify where their data is to be held.
“The issue of data sovereignty is really important to us. Customers should be able to put data into the cloud and specify where it is. We know for many customers it’s also a regulatory compliance issue,” said Padilla.
Posted in Cloud Hosting
Posted on March 22, 2013 at 7:50 pm
A year has now passed since the government launched its G-Cloud programme to try to drive the uptake of cloud computing in the public sector. But is there a reason to celebrate the anniversary of this programme?
The G-Cloud system allows the public sector to rent the use of services as needed and do away with lengthy contracts. The system also allows SMBs to sell to government departments alongside larger enterprises.
While the Cabinet Office has been keenly marketing the programme in the public sector, publicly celebrating each G-Cloud contract signed and holding regular BuyCamp events extolling the benefits of using cloud services, there remain challenges to the programme's adoption among civil servants and local government workers.
The latest iteration of the framework, launched a month ago in January, offers the public sector a choice of 3,200 services from 459 suppliers, three-quarters of which are small and mid-size businesses. According to the government, G-Cloud suppliers have now made £6m from the programme since its launch, with over 60 percent of this going to SMBs.
G-Cloud programme director Denise McDonagh said there is reason to celebrate the one-year anniversary, although she admitted experiencing challenges in shifting the culture of the public sector.
“After only a year most big government departments have bought services from the Cloud, and there is significant buy-in from local government. Evidence of the benefits of cloud is growing all the time, and we are working with buyers to help them adapt to commodity-based IT purchasing,” she said.
However, according to Graeme Swan, IT advisory partner at Ernst & Young, the government's G-Cloud programme team will need outside help to shift deep-rooted public sector procurement attitudes that tend to favour larger, more traditional, on-premise suppliers.
“My general view is the Cabinet Office [which leads the G-Cloud programme] have done a good job, showing good leadership and good PR. They have built a great website and well done to them,” he said in an interview with V3.
“The problem is that no one is using the G-Cloud. Although the Cabinet Office says it’s the programme’s first year and it needs time to gather momentum, I’m just not sure this is the case.
“The problem is that firstly many government departments don’t understand how to buy these services and secondly they don’t know how to integrate them with the rest of their IT infrastructure.”
Swan said a large number of government departments lack intelligent and competent IT buyers as well as the necessary integration skills. Many departments also tend to lack a service-oriented architecture (SOA) IT set-up that would allow them to plug and play cloud services easily.
Posted in Cloud Hosting
Posted on March 20, 2013 at 2:38 pm
Sunnyvale: The changing role of IT is forcing departments to rethink the way they interact with the rest of the business, according to the chief information officer of NetApp.
CIO Cynthia Stoddard told reporters that lessons learned through the company’s own IT operations have caused the firm to take a closer look at the way its own customers can leverage IT staff and bring technology administrators better in line with the business side.
“I have found that transparency around infrastructure costs really helps,” she said.
“The business side has felt that IT is trying to hide something. If you open up and put some of that on the table it becomes more of the business.”
The result can be support for new initiatives such as hybrid cloud deployments or the addition of support for web services to help supplement on-premise technologies and services.
Stoddard described such a process within NetApp's own walls. The company recently adopted a “Net cloud” campaign, which gathers information on the various cloud services and instances being used by employees and attempts to migrate them to a private hosted cloud that allows for improved security and manageability.
Big data has also played a role in helping to shape NetApp’s newest platforms. Stoddard said that the company has bolstered its efforts to build and implement big data platforms which collect and analyse information on support queries and incident reports.
The efforts have helped shape some of NetApp's product features, such as remote support and “phone home” diagnostic technologies.
Posted in Cloud Hosting