7 Dark Secrets Behind Cloud Pricing


Why on earth is our cloud provider's bill exploding every month when the prices it advertises are so low? Our colleagues from IDG provide seven answers to this question.

Hidden prices

Is there anything more attractive than cloud pricing? Few of us today are old enough to remember when a dime could buy anything, but cloud users pay even less. Google's standard N1 machine is priced at $0.0475 an hour (about €0.0421), yet batch workloads that can tolerate interruption can run on a preemptible instance for just $0.01 an hour. Big spenders can step up to the compute-optimized version for $0.015 an hour.

Azure bills per gigabyte consumed, with a ridiculously low price of €0.00069 per gigabyte per month to store data in its Archive Storage tier. It's Amazon, however, that posts the lowest figure, charging an infinitesimal $0.0000002083 per 100 ms of execution for 128 MB of memory running code on Lambda. These figures are puzzling. Auto and home insurance bills might be squeezing the budget, but at these prices it would almost be fun to throw cloud money around like confetti, because each of these services literally costs less than a piece of confetti. Then the end of the month arrives, and the bill from the cloud provider rivals the ones sent by the insurers. How do these fractions of a penny add up so quickly? Our colleagues at IDG shed light on seven secrets of the cloud that show how such low prices can still blow a hole in an IT budget.
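To see how those fractions of a cent compound, here is a minimal back-of-the-envelope sketch in Python. The Lambda rate is the list price quoted above; the traffic figures are invented purely for illustration:

# Back-of-the-envelope: how fractions of a cent become a real bill.
# The rate is the Lambda list price quoted above; the traffic
# numbers are invented for illustration.

LAMBDA_RATE = 0.0000002083         # $ per 100 ms at 128 MB of memory

invocations_per_day = 5_000_000    # hypothetical traffic
units_per_call = 3                 # 300 ms per call = 3 x 100 ms units

daily = invocations_per_day * units_per_call * LAMBDA_RATE
print(f"Per day:   ${daily:,.2f}")        # Per day:   $3.12
print(f"Per month: ${daily * 30:,.2f}")   # Per month: $93.74

Three dollars a day sounds harmless; multiply it by every function, queue, and byte stored, and the insurer-sized bill appears.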

The little hidden extras

Sometimes the most grandiose shows are carried by extras you don't notice. Amazon's S3 Glacier has a "Deep Archive" tier designed for long-term backups, and it is attractively priced: $0.00099 per gigabyte, or about $1 per terabyte per month. It's easy to imagine throwing the backup tapes in the trash because Amazon's service is simpler. But suppose you actually want to examine that data. Click through to the second tab of the pricing sheet and you find the retrieval cost: $0.02 per gigabyte. It is therefore 20 times more expensive to read the data once than to store it for a month. If a restaurant used this pricing model, it would charge €2 for the dish but €40 for the cutlery. Brought back to the realities of backup, this pricing grid still makes sense, since the product was designed for long-term storage, not for browsing. For frequent access to stored content, it is better to stick with the standard S3 tier. But if the goal is to save on archive storage, you need to understand these secondary costs and plan for them in advance.
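The arithmetic behind the cutlery analogy, using the two rates quoted above (the 10 TB archive size is a made-up example):

# Storage vs. retrieval for S3 Glacier Deep Archive, at the two
# list prices quoted above. The archive size is hypothetical.

STORE_RATE = 0.00099     # $ per GB per month
RETRIEVE_RATE = 0.02     # $ per GB read back

archive_gb = 10_000      # a hypothetical 10 TB archive

storage = archive_gb * STORE_RATE
restore = archive_gb * RETRIEVE_RATE
print(f"Storing 10 TB for a month: ${storage:,.2f}")   # $9.90
print(f"Restoring it once:         ${restore:,.2f}")   # $200.00
print(f"One restore = {restore / storage:.0f} months of storage")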

Location is important

Cloud providers dazzle us with world maps showing their data centers around the globe, inviting us to place workloads wherever it is most convenient and secure. But the prices are not the same everywhere. Amazon may charge $0.00099 per gigabyte in Ohio, but it's $0.002/GB in Northern California. In France, it comes to $0.0018/GB per month. Is it the climate? Proximity to the beach? Or just the cost of real estate? That secret is kept. Alibaba, for its part, clearly wants to encourage developers to use its cloud in its data centers around the world: low-end instances start at just $2.50 per month outside China, rise to $7 per month in Hong Kong, and $15 per month in mainland China.
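To make the spread concrete, here is a toy comparison of what the same 50 TB would cost per month at the three per-gigabyte rates quoted above. The AWS region codes are the usual names for those locations; the archive size is invented:

# Monthly cost of the same 50 TB at the regional rates quoted above.
# Region codes are the usual AWS names for these locations.

RATES = {                              # $ per GB per month
    "us-east-2 (Ohio)":          0.00099,
    "us-west-1 (N. California)": 0.002,
    "eu-west-3 (Paris)":         0.0018,
}

archive_gb = 50_000                    # hypothetical 50 TB

for region, rate in RATES.items():
    print(f"{region:27} ${archive_gb * rate:7,.2f}/month")
# Ohio lands at $49.50, Northern California at $100.00:
# double the price for exactly the same bytes.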

Beware of data transfers

The problem with scanning these price lists and moving workloads to the cheapest data centers is that cloud companies charge for the data transfers too. Shuffle data between a vendor's regions in search of the cheapest compute and storage, and the bill may actually skyrocket, because the cost of moving data across the network is surprisingly high. The occasional gigabyte won't make much difference, but replicating a frequently updated database across the country with every write, just in case an earthquake or hurricane strikes, can be an expensive mistake.
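A rough sketch of what continuous cross-region replication can cost. The $0.02/GB inter-region transfer rate and the write volume are assumptions for illustration; check your provider's actual rates:

# Rough cost of continuously replicating a busy database to another
# region. The transfer rate and write volume are assumed numbers.

TRANSFER_RATE = 0.02          # $ per GB moved between regions (assumed)

writes_per_second = 2_000     # hypothetical workload
avg_write_kb = 4              # 4 KB per replicated write

gb_per_day = writes_per_second * avg_write_kb * 86_400 / 1_048_576
monthly = gb_per_day * 30 * TRANSFER_RATE

print(f"Replication traffic: {gb_per_day:,.0f} GB/day")   # ~659 GB/day
print(f"Transfer cost:       ${monthly:,.2f}/month")      # ~$395/month
# Nearly $400 a month for the network alone, before any compute.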

The mouse trap

This kind of trap baits a mouse into a box that snaps shut once the rodent grabs the cheese, catching it without killing it. The same fate awaits cloud infrastructure users when it comes to moving their data. Cloud providers often charge nothing to import data into their systems; after all, would a store charge customers at the door? But when the data needs to leave, the bill suddenly gets much steeper.

The sunk cost fallacy

There always comes a point where a machine or configuration struggles to keep up. In the cloud, you just scale up the computing power and everything runs better, for only a few cents more per hour. On an invoice already running several dollars an hour, we tell ourselves that a few extra cents won't change much, and the providers make the transition effortless: one click and everything is ready. Casinos use the same strategy to get players to reach for their wallets. For accountants, though, this behavior is the sunk cost fallacy, which is to say simply throwing money away: the money already spent will never come back, but future spending can still be controlled.

It is a little different, however, when it comes to software development. Often it is not clear how much memory or CPU a feature will require, and the machines will need more power from time to time. The real challenge is keeping an eye on the budget and keeping costs under control along the way: add a little more CPU here and a little more memory there, and by the end of the month the bill explodes.
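A minimal sketch of how those one-click upgrades compound across a fleet; the two-cent delta and the fleet size are invented numbers:

# How "a few cents more per hour" compounds across a fleet.
# The upgrade delta and fleet size are invented for illustration.

HOURS_PER_MONTH = 730      # ~24 * 365 / 12

upgrade_delta = 0.02       # extra $ per hour after a one-click resize
fleet_size = 40            # hypothetical number of upgraded instances

extra = upgrade_delta * HOURS_PER_MONTH * fleet_size
print(f"Extra cost: ${extra:,.2f}/month")    # $584.00/month
# Two cents an hour across forty machines: $584 a month, over
# $7,000 a year -- without anyone feeling like they spent anything.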

Thank the overhead

A cloud machine is not a machine per se, but a slice of a larger physical machine that has been divided into N portions. These slices, however, are rarely powerful enough to handle the load on their own, so you deploy tools like Kubernetes to make the N portions work together. Why cut a big box into N pieces just to stitch it back together? Why not settle for one machine handling one load? Cloud evangelists will say that people who ask cheeky questions like these are missing the benefits of the cloud: all the extra layers and extra copies of the operating system provide plenty of redundancy and flexibility, and we should be grateful that all these instances start and stop in an elaborate, orchestrated dance.

But the ease of recovery that Kubernetes brings also encourages sloppy programming. A node failure is no longer a problem, because the pod keeps running once Kubernetes replaces the instance. The price is the overhead of maintaining all those extra layers, so that a clean, fresh machine can spin up without any unplanned drama.
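A toy model of that trade-off. Every number here is invented; the point is only the shape of the math when each slice loses a share of its capacity to extra OS copies and agents:

# One big box vs. N orchestrated slices of the same usable capacity.
# All numbers are invented; the point is the shape of the math.

big_box_hourly = 0.80    # hypothetical 16-vCPU machine, $/hour
slice_hourly = 0.05      # hypothetical 1-vCPU slice, $/hour
overhead = 0.10          # fraction of each slice lost to OS copies/agents

vcpus_needed = 16
slices_needed = vcpus_needed / (1 - overhead)   # ~17.8 slices
slice_cost = slices_needed * slice_hourly

print(f"One big box:        ${big_box_hourly:.2f}/hour")
print(f"Orchestrated fleet: ${slice_cost:.2f}/hour")
# $0.80 vs ~$0.89/hour: roughly 11% extra buys the redundancy and
# the ability to replace any failed slice on the fly.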

To infinity and beyond?

Ultimately, the tricky thing about cloud computing is that its best feature is also its most dangerous one: its ability to scale to supposedly endless demand is a financial minefield. Will each user generate an average of 10 gigabytes of egress, or 20 GB? Will each server need 2 GB of RAM, or 4? When launching a project, it is impossible to know. The traditional strategy of buying a fixed number of servers becomes harder to sustain when demand increases, but at least the budget does not skyrocket: the server-side engineers will complain about the overload and the users about slow responses, but the accounting team will not have a heart attack at the end of the month.
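A sketch of why that uncertainty is a budgeting problem, not just an engineering one. The egress and RAM rates, the user count, and the server count are all assumptions:

# Best-case vs. worst-case monthly bill for the unknowns above.
# All rates and counts are assumed for illustration.

EGRESS_RATE = 0.09     # assumed $ per GB of egress
RAM_RATE = 5.00        # assumed $ per GB of RAM per month
users = 10_000
servers = 50

for label, egress_gb, ram_gb in [("Best case ", 10, 2),
                                 ("Worst case", 20, 4)]:
    bill = users * egress_gb * EGRESS_RATE + servers * ram_gb * RAM_RATE
    print(f"{label}: ${bill:,.2f}/month")
# Best case:  $9,500.00/month
# Worst case: $19,000.00/month -- the same launch, double the bill.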

Because while no one notices when costs drop, once the meter starts spinning at full speed, the boss is likely to start paying much closer attention to what the IT infrastructure costs. The bottom line is that our bank accounts aren't as scalable as the cloud.
