Frequently Asked Questions
A data centre is a specialised facility that houses computer systems, storage devices, and associated components such as telecommunications equipment. It is also home to redundant power systems, environmental controls such as air conditioning and fire suppression, and various security devices. These facilities are vital to the smooth functioning of businesses and organisations, as they provide a safe, secure environment for data and equipment, with constant monitoring to ensure optimal performance.
Data centres are used to centralise an organisation's shared IT operations and equipment for the purposes of storing, processing, and disseminating data and applications. They provide important services such as data storage, backup and recovery, data management, networking, and the distribution of large volumes of data. They're essential to the functioning of modern businesses, both for day-to-day operations and for strategic initiatives such as business intelligence and machine learning.
Data centres come in several types, each designed for a specific kind of need. The main types are:
- Enterprise data centres: These are owned and operated by the company they serve. They are typically located on the company’s campus.
- Managed services data centres: In this setup, a company leases servers in a data centre, but the data centre staff manages the servers.
- Colocation data centres: Here, a company rents space within a data centre to host its servers and other hardware.
- Cloud data centres: These are operated by cloud service providers like Amazon Web Services or Google Cloud. They offer scalable resources, which can be adjusted according to the needs of the customer.
- Edge data centres: These are smaller facilities located closer to the populations they serve, designed to deliver cloud computing resources and cached content to end-users with less latency.
A colocation (colo) data centre is a facility where businesses can rent space for servers and other computing hardware. In a colo, the data centre provides the building, cooling, power, bandwidth, and physical security, while the customer provides and manages the servers. This enables businesses to benefit from the economies of scale associated with large data centres — including redundant power and cooling resources, and high-speed network access — without having to manage those resources themselves.
Data centre connectivity refers to the networking and interconnection methods used within and between data centres. This includes everything from the basic physical infrastructure — such as routers, switches, and fibre-optic cables — to the protocols and services that allow for data transmission. Good connectivity is essential for ensuring fast, reliable access to data and services.
A carrier-neutral data centre, also known as a network-neutral data centre, is a type of data centre that allows interconnectivity between multiple telecommunication carriers and/or colocation providers. Customers can choose from a wide range of services without being locked into a single carrier. This fosters competition and can help drive down costs and increase service quality for the customer.
A peering point, also known as an internet exchange point (IXP), is a physical location where networks come together to exchange traffic directly, rather than through a third party. Peering points help reduce costs, improve speed, and increase redundancy by allowing data to take more direct routes between networks.
Data centre security involves various measures and technologies designed to protect a data centre from threats both physical and digital. Physical security measures include access controls, surveillance systems, and secure facility design. Digital security measures can include firewalls, intrusion detection systems, network segmentation, and encryption. The goal is to protect the data stored in the centre from unauthorised access, as well as protect the infrastructure from disruptions or damage.
Private cloud refers to computing services offered either over the Internet or a private internal network to select users only. Public cloud is the standard cloud computing model where service providers make resources, such as applications and storage, available to the general public over the Internet. Hybrid cloud is a solution that combines a private cloud with one or more public cloud services, with proprietary software enabling communication between each distinct service.
Cloud computing delivers shared computing resources, software, or data as a service over the Internet. With cloud services, you do not need to physically maintain or manage servers or storage. Colocation, on the other hand, involves renting physical space within a data centre to house your own hardware. You maintain and manage your servers, storage, and network, while the colocation provider takes care of the building and its utilities.
A cloud gateway is a network node connecting two networks that use different network protocols. It acts as a bridge between your local environment (like a data centre or office) and a cloud service provider. Cloud gateways enable businesses to securely connect their existing networks to cloud services, facilitating data transfer and offering services such as data encryption and loss prevention.
Selecting a data centre outside of London can offer several benefits. It can provide geographic diversity and risk mitigation in the event of a localised disaster in London. Additionally, data centres outside London might offer lower costs due to reduced real estate prices. They can also provide lower-latency connections to regions closer to the data centre's location.
Data centre sustainability involves designing and managing data centres in a way that reduces their environmental impact. This can involve a range of strategies, such as using energy-efficient hardware, improving cooling efficiency, powering facilities with renewable energy, and responsibly disposing of or recycling old equipment.
PUE, or Power Usage Effectiveness, is a metric used to determine the energy efficiency of a data centre. It is calculated by dividing the total amount of power consumed by a data centre by the power consumed by the IT equipment within it. The closer a PUE value is to 1.0, the more efficient a data centre is considered to be.
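As a worked example, the PUE calculation can be sketched in a few lines of Python. The kWh figures below are purely illustrative, not drawn from any real facility:

```python
# Hypothetical monthly energy readings (illustrative figures only).
total_facility_kwh = 1_320_000  # everything: IT load, cooling, lighting, losses
it_equipment_kwh = 1_000_000    # servers, storage, and network gear only

# PUE = total facility power / IT equipment power
pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")  # PUE = 1.32
```

In this example, for every 1 kWh the IT equipment consumes, the facility draws a further 0.32 kWh for cooling and other overheads; a lower figure, closer to 1.0, would indicate a more efficient facility.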
High-Performance Computing (HPC) involves the use of supercomputers and parallel processing techniques to solve complex computational problems. HPC systems have the ability to process data at high speeds and handle large data volumes. They’re often used in fields that require high-powered data processing, such as climate and weather modelling, quantum mechanics, and bioinformatics.
High-Performance Computing (HPC) and Artificial Intelligence (AI) are two distinct but interconnected fields. HPC is about performing complex calculations at high speeds, often used in scientific research and data analysis. AI, on the other hand, is about creating systems that can perform tasks that would require human intelligence, such as understanding natural language or recognising patterns in data. HPC can be used to power AI applications, as AI often requires processing large amounts of data quickly, something that HPC systems are designed to do.
Data centre scalability refers to the ability of a data centre to handle increased demands; in other words, to scale its operations up or down as needed. This could mean adding more servers or storage, increasing network capacity, or even reducing resources when they are not needed. Scalability is a crucial aspect of data centre design and operation, as it allows a data centre to meet the changing needs of its users in an efficient, cost-effective manner.
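One common way to automate scale-up and scale-down decisions is a target-utilisation rule, similar in spirit to the formula used by the Kubernetes HorizontalPodAutoscaler. The sketch below is a simplified illustration; the function name, target, and replica bounds are all assumptions made for this example:

```python
import math

def desired_replicas(current, cpu_pct, target_pct=50, min_r=2, max_r=20):
    # Size the fleet so average CPU utilisation lands near the target:
    # desired = ceil(current * current_utilisation / target_utilisation),
    # clamped to the configured minimum and maximum replica counts.
    desired = math.ceil(current * cpu_pct / target_pct)
    return max(min_r, min(max_r, desired))

print(desired_replicas(4, 75))  # busy fleet: scale up to 6
print(desired_replicas(4, 20))  # quiet fleet: scale down to the floor of 2
```

The clamp on the result reflects the cost-effectiveness point above: resources are released when demand falls, but never below a safe minimum.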
Data centres are powered by a combination of the local electricity grid and backup power systems to ensure continuous operation. The backup systems can include uninterruptible power supplies (UPS), which provide immediate short-term power in case of a grid outage, and generators, which can provide longer-term power. Data centres often have multiple power feeds and backup systems to ensure redundancy. Increasingly, data centres are also making use of renewable energy sources to reduce their environmental impact.
Data centre cooling is essential to keep servers and other hardware at optimal temperatures. Too much heat can lead to performance issues or damage. There are several methods of cooling, including air conditioning, liquid cooling, and containment strategies. The goal is to remove the heat produced by the equipment and keep the temperature and humidity levels within a range that is safe for the equipment.
In a colocation data centre, services typically include the rental of space (which can range from a rack to a cage or a dedicated room), power, cooling, physical security, and network connectivity. Additional services can include remote hands (onsite technical support), disaster recovery, load balancing, backup services, and more. The specific offerings can vary by provider and are often customisable to meet the unique needs of each client.
When selecting a colocation data centre, consider factors such as the location (for latency, disaster risk, and legal compliance reasons), physical and digital security measures, reliability (uptime), scalability (ability to expand as your needs grow), and the level of support services available. It’s also important to assess the reputation and financial stability of the provider.
Download your free Colocation choices UK eBook
+44 (0)1633 988 035