Posted on October 31, 2020 at 1:13 pm
Every website needs a hosting account. This could be a single dedicated server or, more commonly, one shared with other website accounts. Cloud hosting, on the other hand, spreads data across a number of servers in different locations, all connected together. A virtual server accesses this data, but it exists only in the virtual environment, or cloud, as it is known. So what are the advantages of using this type of hosting for a website?
One of the chief benefits seen by businesses is the improved security that cloud hosting provides. It is far easier to protect a cloud network from cyber threats, as the data is held in a central location rather than on a single dedicated server. Another benefit is that data can easily be recovered should any device accessing the cloud fail.
The flexibility of cloud hosting is a definite advantage for a new business. As the business grows and takes on more employees, its demand for capacity rises; cloud hosting accommodates these fluctuating needs over time without additional cost to the business.
Posted in Cloud Hosting
Posted on January 27, 2016 at 10:14 am
Cloud hosting is undoubtedly a convenient choice for many companies. It allows for low-cost, low-maintenance storage which is easy to access. However, is it secure?
There are lots of differing opinions about whether or not cloud storage is a secure option to choose, so it can be very difficult for businesses to make an informed decision. Some people have expressed concerns about the cloud being a multi-tenant storage method, as numerous people will be remotely accessing the server. There are also some worries that sensitive data might not be properly deleted and that cloud storage is more permanent than a physical server – will your data always exist somewhere?
It is in fact entirely possible to wipe any data you want to, in exactly the same way as you would on a physical network. It’s also possible to ensure that other people can’t access your data by isolating environments on a multi-tenant cloud. You can place your VMs on their own network, ensuring that nobody else can see your data.
It is possible for cloud hosting to be safe and secure, but make sure that you choose a company that you can trust. It’s possible for people to abuse the system, so you need to feel comfortable with your selection. Read about potential companies as much as you can, and look at plenty of reviews.
Posted on June 17, 2014 at 8:39 pm
VMware is advancing its vision of the software-defined data centre with several key enhancements to its cloud and virtualisation stack, including an update of vSphere, the release of its NSX network virtualisation technology and a beta of VMware Virtual SAN, a platform for virtualising storage in the data centre.
Announced at VMware’s VMworld event in San Francisco, the updates are intended to help customers move a step closer towards the goal of an entirely automated and virtualised data centre capable of delivering services on demand and adapting dynamically to changing requirements.
While VMware has built its reputation on its vSphere platform for virtualising server compute resources, the firm is now seeking to embrace networking and storage. This is necessary in order to bring the entire data centre under automated control, according to VMware’s senior product marketing manager in EMEA, Rory Choudhuri.
“The whole of your IT needs to become virtualised. In order to achieve the response times required for business-critical applications and services, it has to be completely automated. You have to take the human out of the loop,” he said.
To this end, the firm is releasing VMware NSX, which delivers the entire Layer 2 to Layer 7 networking and security model in software. The platform unites VMware’s vCloud Network and Security (vCNS) with the network virtualisation technology it gained via the acquisition of Nicira last year.
NSX, which was first detailed by VMware earlier this year, treats the physical network as a pool of transport capacity that can be carved up by creating virtual networks as required, in a similar way that vSphere pools and allocates server resources.
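Conceptually, that pooling model can be sketched as a simple capacity allocator. The class and method names below are purely illustrative and do not correspond to any actual NSX API; this is only a toy model of carving virtual networks out of a shared pool of transport capacity.

```python
class TransportPool:
    """Toy model of a physical network treated as a pool of capacity.

    Purely conceptual: NSX's real control plane is far richer, and the
    names here are invented for illustration only.
    """

    def __init__(self, capacity_gbps):
        self.capacity_gbps = capacity_gbps
        self.virtual_networks = {}  # name -> allocated bandwidth (Gbps)

    @property
    def available_gbps(self):
        return self.capacity_gbps - sum(self.virtual_networks.values())

    def carve(self, name, gbps):
        """Carve a virtual network out of the remaining capacity."""
        if gbps > self.available_gbps:
            raise ValueError("insufficient transport capacity")
        self.virtual_networks[name] = gbps

    def release(self, name):
        """Return a virtual network's capacity to the pool."""
        del self.virtual_networks[name]


pool = TransportPool(capacity_gbps=40)
pool.carve("tenant-a", 10)
pool.carve("tenant-b", 25)
# 5 Gbps now remains; carving more than that raises ValueError
```

The point of the analogy is that, as with vSphere and server resources, consumers see only the virtual networks, while the allocator tracks the underlying physical capacity.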
However, VMware said it has learned lessons since the introduction of its hypervisor, and has been working closely with the networking industry to ensure that vendors are aware of what it is doing and are on board. Dell is preparing to ship a new line of switches compatible with VMware NSX, for example.
Meanwhile, VMware Virtual SAN (VSAN) is being made available as a public beta later this quarter. It is designed to take existing storage resources in the data centre, such as SAN and NAS arrays and direct-attached storage (DAS), pool them together, then present the pooled capacity back to the system.
VSAN implements a policy-driven control plane that automates storage consumption and management, and is being touted as especially useful for customers implementing virtual desktops (VDI) or Hadoop deployments.
Choudhuri said that VSAN has been operating successfully in private beta deployments for about six months and is “basically ready to go”, but that VMware is taking the cautious approach of pushing it out for public beta tests before a full commercial release.
VMware also announced vSphere 5.5, which adds performance and scalability enhancements, plus support for operating Hadoop deployments. The Hadoop support comes in vSphere Big Data Extensions, while vSphere 5.5 also enables configurations with twice the previous limits on physical CPU and memory.
Also extended in vSphere 5.5 is VMware’s high availability (HA) support. With vSphere App HA, failures of applications as well as virtual machines can be detected and recovered from automatically.
VMware also announced that its vCloud Automation Center and vCenter Operations Management Suite capabilities have now been added to all editions of its vCloud Suite 5.5. Previously, only the Enterprise and Advanced editions had Operations Management Suite, while vCloud Automation Center was only in the Enterprise edition.
Posted on June 15, 2014 at 8:15 am
Traffic management firm Gigamon has unveiled an update for its GigaVue service that offers IT teams the ability to gain insights on the different traffic demands from different areas of the business.
The GigaVue 3.1 update will include a Visibility as a Service (VaaS) add-on within the Flow Mapping process to enable administrators to supply data on the traffic within departments.
This could be used by teams such as marketing to analyse visitor traffic, or security teams looking at event monitoring after an incident.
Gigamon chief strategy officer Shehzad Merchant said that providing this kind of system will help enterprises apply cloud principles internally to gain greater insights into their data.
“The notion of multi-tenancy has made its way from the public cloud space into enterprise IT infrastructure as well,” he said.
“This solution enables network administrators and services teams to virtualise the Visibility Fabric and offer Visibility as a Service to the different IT departments.”
The firm said that this capability will enable organisations to alter management policies on a per-team and per-department basis as needs require, while maintaining the compliance and privacy controls they have in place across the enterprise.
These tenants, which include various IT operations teams, will be able to change monitoring and traffic visibility policies dynamically on a per-organisation or per-tenant basis without impacting other departments’ monitoring policies.
The GigaVue 3.1 update also includes role-based access control capabilities and improved workflow displays for monitoring policy configurations. It will launch on 30 September at no additional cost for existing customers of the GigaVue tool.
Posted on June 13, 2014 at 7:16 pm
Technology and outsourcing services firm Capgemini has unveiled a new service to help firms better manage ever-growing amounts of data by utilising the Amazon Web Services (AWS) cloud computing platform.
Capgemini claims the Elastic Analytics service will offer customers an end-to-end big data analytics solution and will support most leading business intelligence (BI) software packages.
The Elastic Analytics service works by combining large source sets of structured and unstructured data, using existing extract, transform, load (ETL) technologies and the AWS Hadoop-based solution Amazon Elastic MapReduce (EMR). Once collected, the data is merged into analytics engines that businesses can use to study it.
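As a rough sketch of how such a pipeline might be driven programmatically, the snippet below assembles a hypothetical EMR job-flow request for a single ETL-style Hadoop streaming step. The cluster name, instance types, script names and S3 paths are all invented for illustration, and the final API call (via an AWS SDK such as boto) is shown only as a comment since it requires live credentials.

```python
# Hypothetical EMR job-flow request for an ETL-style streaming step.
# All names, paths and sizes here are invented for illustration; none
# of them describe Capgemini's actual Elastic Analytics configuration.

def build_job_flow(input_uri, output_uri):
    """Assemble the request dict for a single-step Hadoop streaming job."""
    return {
        "Name": "elastic-analytics-etl",        # illustrative cluster name
        "Instances": {
            "MasterInstanceType": "m1.large",   # 2014-era instance types
            "SlaveInstanceType": "m1.large",
            "InstanceCount": 4,
        },
        "Steps": [{
            "Name": "transform-and-merge",
            "HadoopJarStep": {
                "Jar": "s3://elasticmapreduce/libs/streaming.jar",
                "Args": [
                    "-input", input_uri,
                    "-output", output_uri,
                    "-mapper", "transform.py",  # hypothetical ETL scripts
                    "-reducer", "merge.py",
                ],
            },
        }],
    }


flow = build_job_flow("s3://example-bucket/raw/", "s3://example-bucket/merged/")
# With credentials configured, the dict would be submitted via an SDK, e.g.:
# boto3.client("emr").run_job_flow(**flow)
```

The step reads raw structured and unstructured input from one bucket, transforms it, and writes the merged output where downstream analytics engines can pick it up.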
Capgemini senior vice president for business information management (BIM) Scott Schlesinger said that AWS’ adaptable nature makes Elastic Analytics one of the most flexible and cost effective big data solutions available.
“Organisations are continuously looking for optimized solutions that deliver shorter ‘time-to-value’ advanced analytics. AWS is a highly adaptable and extensible platform that rapidly offers organizations the ability to launch and sustain their advanced analytics initiatives,” he said.
Big data management is a growing problem facing businesses around the world, with many holding vast reserves of unstructured and often unprotected data.
Numerous companies have listed solving the big data problem as a key opportunity, with firms such as SAP offering similar analytics tools through its HANA platform. The German software firm launched its HANA analytics database platform as an enterprise cloud service in May.
HP has also taken an interest in the market. Chief executive Meg Whitman has claimed traditional IT solutions are no longer powerful enough for enterprise-level businesses, warning during a speech at HP Discover earlier this year that companies will need to move their systems into the cloud if they hope to compete in the new landscape.
Posted on June 11, 2014 at 6:58 pm
Cloud storage firm Box has announced a pricing strategy overhaul in order to entice previously elusive SMB customers to its services.
In addition to doubling the amount of storage space available to its free customers, the company has also created a new Starter price plan specifically designed for small businesses. Costing £3.50 per user per month, Starter gives teams of up to 10 members 100GB of space with a maximum individual file size of 2GB.
Box chief operating officer Dan Levin told V3 that this new pricing strategy was a response to the realisation that the firm’s £11 per user per month Business plan was not suitable for smaller enterprises. He said that often smaller businesses instead found themselves using the free version of Box to run their businesses, which meant they could not administer additional users or properly manage their data.
“That was not the right strategy to get those SMBs,” Levin explained. “These technologies, which are used by some of the largest companies in the world, are now accessible to SMBs in a way they haven’t previously been.”
The doubling of the storage space provided to non-paying users was an important addition, according to Levin. “In order to continue to add value to our personal users, we’re doubling the amount of storage from five to 10GB. We’re making sure that businesses that use the personal product have the right service; it’s a very important part of our business model,” he said.
As well as the new pricing plans, Box has also enhanced its Business option, adding integration for one enterprise application such as Salesforce or Active Directory. Previously this integration was only available on the more expensive Enterprise price plan, which now costs £25 per user per month and still provides unlimited application integration.
Posted on June 9, 2014 at 6:21 pm
Cloud and hosting firm Rackspace has announced a new service aimed at easing VMware customers into hosted infrastructure as a stepping stone to future cloud adoption.
The Dedicated VMware vCenter Server is part of Rackspace’s Managed Virtualisation service. Unlike a public cloud, it will provide managed single-tenant vCenter Servers running inside Rackspace data centres, designed to give enterprise customers the confidence to migrate existing VMware workloads outside of their own premises.
According to Rackspace, the Dedicated VMware vCenter Server environment will look and feel like an extension of the customer’s own data centre. It will enable customers to fully manage servers via the VMware vCenter APIs or equivalent third-party management tools, while providing visibility into costs and usage, with Rackspace providing support for the physical infrastructure.
Under the current Managed Virtualisation service, Rackspace exposes some features of vCenter through its MyRackspace customer portal, but not the full set of management capabilities.
Rackspace chief technology officer John Engates said: “This new service has been designed to enable customers to migrate workloads out of their data centre and into a Rackspace data centre. This allows Rackspace to do what we do best, which is providing a fully managed hybrid cloud hosting service backed by Fanatical Support with maximum uptime.”
The firm has been offering VMware-based hosted private cloud infrastructure since at least the start of 2011, but has since begun to shift its emphasis towards services based on the OpenStack cloud computing framework that it co-founded.
Rackspace has also been strongly touting its vision of hybrid cloud computing, where private on-premise cloud infrastructure is supplemented by public cloud resources as required.
Today’s announcement can thus be seen as Rackspace attempting to drum up more hybrid cloud business by offering VMware users a stepping stone towards it. By offering dedicated servers linked to a customer’s on-premise infrastructure, the firm seems to be banking on users seeing the attraction of having someone else take care of managing the infrastructure.
At launch, licensing for the Dedicated VMware vCenter Server will be a flat monthly fee per hypervisor, regardless of the number of VMs managed, according to Rackspace. However, the exact price has yet to be disclosed.
Posted on June 7, 2014 at 8:26 pm
HP has unveiled a cloud service that enables organisations to securely share and collaborate on files with colleagues, customers and business partners, while maintaining visibility and governance over content.
Called Autonomy LinkSite, it effectively integrates HP’s on-premise Autonomy WorkSite document and email management system with HP Flow CM, a cloud-based file-sharing and collaboration service. The result, according to HP, is an enterprise-grade document and email management system with the ease of use and simplicity of a consumer solution.
HP said the tool is due for early release to testers from mid-September, and is scheduled for full commercial release on or near 15 October.
HP LinkSite is the latest in a series of product launches aimed at tackling the problem of sharing content easily, while allowing an organisation oversight and control over the information contained within.
The problem, according to HP, is that workers expect to be able to share content freely as they do with consumer-grade services such as Dropbox, and are liable to resort to these if their organisation does not provide a satisfactory alternative solution.
Autonomy LinkSite addresses this by extending the traditional workspace from Autonomy WorkSite into the cloud, enabling users to share a single file or an entire project folder with others both inside and outside the firewall.
However, content uploaded to the cloud this way inherits all security properties set in Autonomy WorkSite, according to HP. All actions taken on content in the cloud are also reported via the Autonomy WorkSite audit trail, extending enterprise security and governance to the cloud.
Files shared via Autonomy LinkSite are synchronised across all employee devices, and can be accessed through any web browser, HP said.
Neil Araujo, general manager of Enterprise Content Management at HP Autonomy, said that organisations no longer have to turn a blind eye to workers using consumer file sharing services.
“Businesses now have a very attractive alternative that satisfies the needs of the users as well as the IT and compliance teams,” he said.
Pricing for HP LinkSite is expected to start at $19.95 (around £13) per user per month, depending on the length of contract, HP said. For larger enterprise organisations with 1,000 users or more, licensing is likely to be as low as $9.95 (around £6.50) per user per month, also dependent on the length of contract and features selected.
Posted on June 5, 2014 at 1:48 pm
Google is beginning to encrypt all data uploaded to its Cloud Storage platform, in a bid to bolster its security credentials.
Google’s Cloud Storage is a data storage product for businesses, intended for static content such as web pages and other permanent files. Previously, users would have to create their own encryption keys and manage them personally, but with this update Google will do the legwork for its customers, handling the encryption keys and the encryption process.
Dave Barth, Cloud Storage product manager, detailed this change further on the Cloud Platform blog. He said: “If you require encryption for your data, this functionality frees you from the hassle and risk of managing your own encryption and decryption keys.
“We manage the cryptographic keys on your behalf using the same hardened key management systems that Google uses for our own encrypted data, including strict key access controls and auditing. Each Cloud Storage object’s data and metadata is encrypted under the 128-bit Advanced Encryption Standard (AES-128), and each encryption key is itself encrypted with a regularly rotated set of master keys.”
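The key hierarchy Barth describes is a form of envelope encryption: each object is encrypted with its own data key, and that data key is in turn wrapped by a master key that can be rotated independently of the stored data. The sketch below shows only that structure; it uses a deliberately insecure toy XOR “cipher” in place of AES-128, and none of the names correspond to Google’s actual implementation.

```python
import secrets

def toy_cipher(key, data):
    """Toy XOR 'cipher' standing in for AES-128. NOT secure; it only
    illustrates that the same key both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_object(master_key, plaintext):
    """Envelope encryption: a fresh per-object data key encrypts the
    data, and the master key wraps (encrypts) the data key."""
    data_key = secrets.token_bytes(16)  # 128-bit per-object key
    return {
        "ciphertext": toy_cipher(data_key, plaintext),
        "wrapped_key": toy_cipher(master_key, data_key),
    }

def decrypt_object(master_key, obj):
    """Unwrap the data key with the master key, then decrypt the data."""
    data_key = toy_cipher(master_key, obj["wrapped_key"])
    return toy_cipher(data_key, obj["ciphertext"])

def rotate_master_key(old_master, new_master, obj):
    """Rotation only re-wraps the small data key; the object's
    ciphertext is left untouched."""
    data_key = toy_cipher(old_master, obj["wrapped_key"])
    obj["wrapped_key"] = toy_cipher(new_master, data_key)
```

The design point is the last function: because only the small wrapped key depends on the master key, master keys can be “regularly rotated” without re-encrypting every stored object.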
Barth added that if users wish to provide their own encryption, they are still free to do so. Currently, only new data written to the Cloud Platform will be encrypted by Google, which includes existing files that are overwritten. Barth said that older files left untouched will gradually undergo the encryption process in the coming months, while he also maintained that the service itself would not change in any visible way, either in terms of performance or functionality.
Encryption features were already available in other Google products including Persistent Disks and Scratch Disks, which are part of the Google Compute Engine cloud service. The encryption does not yet extend to Google’s consumer-facing cloud product, Google Drive.
Last week Google suffered a partial outage in various heavily used services including Search, Gmail, Drive and Talk. As a result, a 40 percent decline in overall web traffic was reported, highlighting the power that the search giant has over the world’s internet consumption.
Posted on June 3, 2014 at 12:10 pm
Opscode has extended the capabilities of its Chef IT configuration and automation platform beyond just compute to cover networking and storage infrastructure in a new release called Enterprise Chef. The firm also announced it is working with Microsoft to better integrate Chef with the widely used Windows PowerShell tool.
Available immediately, Enterprise Chef builds on the existing capabilities of Chef for automating the provisioning and configuration of servers, based on reusable definitions called cookbooks and recipes that are written using the Ruby programming language.
Enterprise Chef is now able to automate the provisioning and management of compute, networking and storage resources, according to the firm, greatly extending its use for configuration management in the data centre, especially in the operation of both public and private cloud infrastructure.
Adam Jacob, Opscode co-founder and chief customer officer, said that businesses are in the midst of a major transformation in the way they operate their IT services, and need greater flexibility in deploying and managing infrastructure.
“Today we’re delivering an automation platform that accelerates this transformation by delivering on-demand IT services to achieve the speed necessary for meeting the new expectations of customers,” he said.
To help support this expansion of its capabilities, Opscode said it is collaborating with leading networking vendors to integrate Enterprise Chef into next-generation networking technologies, enabling it to automate networking port configuration and provisioning of bandwidth.
These vendors include Cisco, Juniper Networks, Arista Networks, Cumulus Networks and developer of software-defined networking tools Plexxi.
Meanwhile, Opscode is working with Microsoft to integrate Chef with Windows PowerShell, specifically the Desired State Configuration feature of the Windows Management Framework (WMF). This will provide administrators with new options to automate Windows resources in the data centre, the firm said.
While Chef was previously available in separate Private Chef and Hosted Chef versions, Enterprise Chef effectively replaces both with a single release that can be operated as on-premise software or as a hosted service.
Pricing starts at $6 (£3.80) per node for both deployment models, but Enterprise Chef is free to deploy for five nodes or fewer.