Zero Waste. More Money for R&D.

Cloud capacity is purchased assuming a ‘theoretical max’ workload (essentially 100% utilization) to avoid performance problems during a surge in demand, for example a spike in website traffic. In reality, surges are rare and resources are staggeringly underutilized. Yet as the consumer, you are responsible for paying for your full allotment of resources, including the unused ones.

Currently, the industry-accepted approach to billing is based on “fixed” prices for the total resources supplied to host your workload. This legacy approach benefits only the provider and places a significant financial burden on the consumer, whether you realize it or not.

… Regardless of what Amazon or Microsoft says, you are not actually paying for what you use. The resources they supply are billed by time, not by the amount actually consumed, the way electricity is.
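To make the difference concrete (illustrative numbers only, not a quote from any provider’s price list): an instance billed at $0.10 per hour costs $72 for a 720-hour month whether it averages 5% or 95% utilization. Billed by consumption at the same nominal rate, the 5% workload would owe roughly 0.05 × $72 ≈ $3.60 for that month.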

The chart below shows an example of cloud usage measured by the resources actually consumed, using our software.

[Chart: cloud usage measured by resources consumed]

The Cloud Gauge

The Cloud Gauge is ‘Made in USA’ software that measures the consumption of the underlying resources within a computing environment. It works much like an electricity meter, which measures consumption in watt-hours while appliances are powered on.
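The Cloud Gauge itself is proprietary, but the metering idea can be sketched: sample utilization at a fixed interval and accumulate it over time, the way a meter accumulates watt-hours. The sketch below is illustrative only; it uses the open-source psutil library, measures CPU alone, and is not the product’s actual measurement logic.

```python
# Illustrative only: a meter-style sampler that accumulates CPU usage over time,
# the way an electricity meter accumulates watt-hours. The real Cloud Gauge
# covers more resource types and uses its own measurement logic.
import time
import psutil


def meter_cpu(duration_s: float = 60.0, interval_s: float = 1.0) -> float:
    """Accumulate CPU core-seconds actually consumed over the duration."""
    consumed = 0.0
    cores = psutil.cpu_count()
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent blocks for interval_s and reports average utilization.
        utilization = psutil.cpu_percent(interval=interval_s) / 100.0
        consumed += utilization * cores * interval_s
    return consumed


if __name__ == "__main__":
    used = meter_cpu(duration_s=10.0, interval_s=1.0)
    print(f"CPU actually consumed: {used:.1f} core-seconds")
```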

The Cloud Gauge is available today in a cloud-hosted version; a version that runs exclusively behind your firewall is coming soon.

Workload Consumption Unit (WCU)

The Cloud Gauge uses an algorithm to convert the usage of different server components (Compute, Graphics, Storage, Network) into a standardized unit of measure, known as the WCU. Think of the WCU as electricity’s watt, but for measuring computing work. From a usage perspective, a WCU is equivalent regardless of the computing environment or the underlying hardware on which an application runs.
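The actual WCU algorithm is not described here, but the general idea of a standardized unit can be sketched: normalize each resource measurement and combine them with weights into a single number. In the sketch below, the field names, weights, and baselines are invented purely for illustration.

```python
# Hypothetical sketch of a standardized consumption unit: combine heterogeneous
# resource measurements into one number using weights. The weights and units
# here are invented for illustration; they are not the actual WCU algorithm.
from dataclasses import dataclass


@dataclass
class ResourceSample:
    cpu_core_seconds: float    # compute actually consumed
    gpu_core_seconds: float    # graphics actually consumed
    storage_gb_seconds: float  # storage actually held
    network_gb: float          # data actually transferred


# Invented weights expressing how much "work" one unit of each resource represents.
WEIGHTS = {
    "cpu_core_seconds": 1.0,
    "gpu_core_seconds": 5.0,
    "storage_gb_seconds": 0.001,
    "network_gb": 0.5,
}


def to_wcu(sample: ResourceSample) -> float:
    """Collapse heterogeneous resource usage into a single illustrative 'WCU' value."""
    return (
        WEIGHTS["cpu_core_seconds"] * sample.cpu_core_seconds
        + WEIGHTS["gpu_core_seconds"] * sample.gpu_core_seconds
        + WEIGHTS["storage_gb_seconds"] * sample.storage_gb_seconds
        + WEIGHTS["network_gb"] * sample.network_gb
    )


if __name__ == "__main__":
    hour_of_usage = ResourceSample(
        cpu_core_seconds=450.0,
        gpu_core_seconds=0.0,
        storage_gb_seconds=36_000.0,
        network_gb=2.0,
    )
    print(f"Usage for the hour: {to_wcu(hour_of_usage):.1f} WCU (illustrative)")
```

Because the unit is defined by what was consumed rather than by which machine did the work, the same workload yields the same number whether it runs on-premises or in any public cloud.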