Local cloud server
On October 26, 2023, Oxide Computer Company announced the start of sales of “the world’s first commercial cloud computer”, an idea first presented back in 2020 in the Stanford lecture “The Soul of a New Machine: Rethinking the Server Computer”.
Not everyone immediately understood how the 0xide differs from a regular server, and why the marketing materials call it a “cloud computer”.
To begin with, it makes sense to look into the past for a moment.
▍ A brief history of servers
The IBM 709 (1957) can be considered the first server in the modern sense; at the very least, it was the first multitasking computer in history. Several jobs could be loaded into it, and it executed them “simultaneously”. This 70-ton server occupied a 14×13 meter room (plus a smaller room for air conditioning):
IBM 709 front panel, source
The next stage was the PDP-11 (1970); then, for decades, the trend was set by new generations of mainframes, mainly from IBM, Sun Microsystems, Silicon Graphics (which acquired Cray), and others.
For example, in the 1990s, Sun Microsystems released corporate servers in the Sun Enterprise line, such as:
The Sun Enterprise 10000 “Starfire” (64 UltraSPARC II processors, Solaris operating system) cost about $1 million. Large sites of the dot-com era ran on such servers (eBay, for example, occupied an entire cabinet).
Today these commercial mainframes look exotic. In the 2000s, customers began to buy or lease mostly standard 1U or 2U rack-mounted servers.
These servers kept the standard architecture of a personal computer and did not differ much from it: the same x86 processors (in server variants), the same drives, memory, and ports (USB and so on), a VGA connector, a CD drive, etc.
HP ProLiant DL380 G6 Server (2011), source
Nowadays, these servers have evolved into something like this:
HPE ProLiant DL380 Gen10 Server
The CD drive has been replaced with a DVD drive and the VGA connector with HDMI, but it is the same old PC architecture.
Meanwhile, companies like Google decided to create their own server designs, ideal for data centers. This is how the modular Hyperscale design (2009) appeared: the cheapest possible hardware and standard components that are easy to replace:
Google Server (2009)
Here, all unnecessary elements of a personal computer were stripped from the servers, leaving only the essential “skeleton”. Today these servers have evolved into something like this:
Facebook Tioga server (2018) with the Open Compute Project open architecture
Bryce Canyon data storage for 72 3.5″ drives
The creators of the 0xide servers saw a problem. On the one hand, companies have a high demand for installing their own servers under their own control, rather than in someone else’s data center.
On the other hand, only two types of servers are on sale: general-purpose 1U/2U machines built on PC architecture, and bespoke Hyperscale designs for data centers. It is not clear what to put in the server room of an office building.
It is also a problem that the server hardware is not optimized for specific software, and the software is not optimized for specific hardware. All products are sold as universal: any OS and software can be installed on the server, and the problems of project design, software installation, integration, and support fall on the buyer’s shoulders.
This is how the 0xide computer was born: a unique development designed to solve these problems, a new approach to servers.
▍ The 0xide cloud computer
The 0xide cloud computer is a new-generation server that is installed on the client’s premises but works like a dedicated server in a remote data center: the client barely deals with its maintenance and support, since everything is automated as much as possible.
0xide is a server rack with 16 servers, plus two 32-port switches (left) and a patch panel (right):
It resembles virtual hosting: the built-in software immediately offers to allocate resources as VM instances:
A separate OS can be installed on each VM, with a chosen number of processors and amount of memory.
That is, the client gets something like ordinary virtual hosting, only locally on premises: a virtual private cloud, with no per-resource fees of the kind customary in conventional clouds.
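As a rough illustration of what allocating a VM instance through such a rack’s control plane might look like: the sketch below builds a request body for a hypothetical REST endpoint. The endpoint path, field names, project name, and image name are all assumptions for illustration, not the documented Oxide API.

```python
import json

# Hypothetical endpoint of the rack's built-in control plane
# (illustrative only; not the documented Oxide API).
API_URL = "https://rack.example.internal/v1/instances?project=webapp"

# Assumed request body: carve 4 vCPUs and 8 GiB out of the rack's pool.
payload = {
    "name": "web-01",
    "ncpus": 4,                      # virtual CPUs for this instance
    "memory": 8 * 1024**3,           # 8 GiB, expressed in bytes
    "hostname": "web-01",
    "boot_disk_image": "debian-12",  # assumed image identifier
}

body = json.dumps(payload)
print(body)  # this JSON would be POSTed to API_URL with an auth token
```

The same kind of request could equally be issued from the rack’s console UI; the point is that resource allocation is a single API call rather than physical provisioning.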
Built-in software functions:
- Integrated metrics and telemetry, load and performance monitoring.
- Full integration of software with hardware (hypervisor, control panel, data storage system, API endpoints for developers and users). In fact, proprietary software is the main feature of 0xide, because the hardware there is quite standard (see below).
- Allocation of resources only at the level of projects (quotas).
- Quick self-configuration of the network and routing. Management is available via API and console.
Hardware specifications:
- Compute modules (sleds): 16, 24, or 32
- Processors: 16–32 AMD EPYC 7713P (Milan), 64 x86 cores each (1,024–2,048 cores total)
- Memory (DRAM): 16–32 TiB
- Storage: 465.75–931.5 TiB (3.2 TB NVMe drives, U.2 2.5″, ten per sled)
- Switches (L2/L3): 12.8 Tbit/s
- Power supplies: 6 (5+1 redundant)
- Power consumption (nominal/maximum): 12/15 kW
- Dimensions: 2354×600×1060 mm
- Weight: 1145 kg
- Maximum heat output: 61,416 BTU/h
- Airflow requirement: 145.8 CFM per kVA
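To make the cooling figures concrete, here is a back-of-the-envelope check of the spec sheet. It assumes a power factor of roughly 1, so that kW ≈ kVA (an assumption on my part, not stated in the spec):

```python
# 1 kW of electrical load dissipates ~3412 BTU/h of heat.
BTU_PER_KWH = 3412.14
CFM_PER_KVA = 145.8            # airflow requirement from the spec sheet

nominal_kw, max_kw = 12, 15    # power consumption from the spec sheet

heat_nominal = nominal_kw * BTU_PER_KWH   # heat output at nominal load
airflow_max = max_kw * CFM_PER_KVA        # airflow needed at full load

print(round(heat_nominal), round(airflow_max))  # -> 40946 2187
```

So at nominal load the rack dumps roughly 41,000 BTU/h into the room and at full load needs on the order of 2,200 CFM of airflow, which is what the server room’s cooling must be sized for.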
Such computing resources (2,048 cores, 32 TiB of memory) would cost a fortune in a conventional cloud. And here it is one server that requires no assembly or setup: you buy it, install it, and everything works right away, like virtual hosting or the cloud, only with no per-resource charges.
The American company Oxide Computer was founded in 2019, hired its staff in 2020, released the first functional prototype in 2021, and only at the end of 2023 began accepting orders, without publishing prices publicly. It is not yet known how successful the idea will be, but the first reviews are positive.
▍ Leaving the cloud: the financial case
Numerous stories show that moving from the cloud to self-hosting brings a direct financial benefit. Both rented servers and one’s own hardware compare favorably against the unreasonably inflated traffic rates of commercial clouds such as AWS. For example, an ordinary VPS costing a couple hundred rubles a month comes with unlimited traffic, while in the cloud the same terabytes of traffic can cost thousands of dollars.
Company spending on cloud hosting exceeded $100 billion in 2020 and continues to grow.
Some companies pay up to 75–80% of their revenue for the cloud; for software companies this is not unusual.
The cost of cloud services as a percentage of the revenue of some software companies based on public financial statements, source
What is a cost (a loss) for companies is profit (excess profit) for cloud providers. According to experts, prices for public cloud services can be 10–12 times higher than the cost of operating one’s own data centers. AWS’s operating margin, including all infrastructure and development costs, is around 30%, which is very high.
There is an opinion that the cloud pushes us to overcomplicate our systems. Everything can be arranged much more simply on your own hosting, without hundreds of microservices, for a simple web application with at most a few thousand simultaneous users.
AWS customers complain about tricks from the monopoly that resemble unfair competition. AWS executives themselves say that customers’ cloud setups must be constantly optimized so that the bills do not spiral out of control due to price imbalances. There are cases where a small startup burns through thousands of dollars in a single day.
Starting in December 2023, Google will systematically delete files from inactive accounts, and Google Drive has already lost user files. Amazon, for its part, can accidentally delete your ElastiCache clusters. Understandably, the number of people willing to switch to their own hosting, where everything is under their control, is growing.
▍ Local cloud on your server
While a cloud service is a genuinely convenient option for a startup or a small company, a large organization has to ask whether its money is being spent rationally. Inflated rates for traffic and compute in cloud hosting mean that in some cases it is cheaper to run your own servers. The main thing, as the company grows, is not to miss the break-even point at which local hosting becomes more profitable than cloud hosting. At large scale, migrating to self-hosting can save tens of millions of dollars, as it did for Dropbox, which migrated 600 PB of data from AWS to its own cluster. For the largest software companies, moving away from the cloud can reduce costs by about 50% on average.
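The break-even logic above can be sketched as a toy model. All the dollar figures here are illustrative assumptions, not numbers from the article:

```python
# Toy break-even model: when does buying hardware beat renting the cloud?
# All figures below are made-up illustrative assumptions.
cloud_monthly = 40_000       # assumed cloud bill, USD/month
hardware_capex = 600_000     # assumed one-time cost of own rack(s)
selfhost_monthly = 10_000    # assumed power, colo space, staff per month

# Self-hosting wins once cumulative costs cross:
#   hardware_capex + selfhost_monthly * m < cloud_monthly * m
breakeven = hardware_capex / (cloud_monthly - selfhost_monthly)
print(f"break-even after {breakeven:.0f} months")  # -> 20 months
```

With these assumed numbers the hardware pays for itself in under two years; the real calculation depends heavily on traffic volume, staffing, and hardware depreciation, but the shape of the comparison is the same.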
The 0xide servers are only the first sign of the movement toward the “local cloud”. You can also run an AWS emulator on your own servers: for example, LocalStack (localstack.cloud) emulates more than 80 AWS services:
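A minimal LocalStack setup fits in a few lines of Docker Compose. The image name and edge port below are LocalStack’s documented defaults; the particular service list is just an example:

```yaml
# docker-compose.yml -- minimal LocalStack sketch
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"          # single edge port for all emulated services
    environment:
      - SERVICES=s3,sqs,dynamodb   # example subset of emulated services
```

After `docker compose up`, the AWS CLI and SDKs can be pointed at the local endpoint, e.g. `aws --endpoint-url=http://localhost:4566 s3 ls`, and the application code stays unchanged.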
This is useful for on-premises testing of cloud applications and for reducing cloud bills. Development and testing in a local cloud are faster and cheaper than in the real one, so development speed increases as well.
Today anyone can set up a local cloud to duplicate or replace some of the expensive services of the real cloud, migrating on premises partially and gradually, without sudden moves.