Data Coins

What are Data Coins?

SCALAR's Data Coins model ensures transparency and fairness in API usage for customers. It provides flexibility while maintaining cost-effectiveness. The Fair Usage Policy (FUP) defines a baseline for data coin consumption across organizations and streamlines usage expectations, contractual agreements, and billing transparency.

  • Datahub Subscriptions offer significant savings in Data Coin usage, especially with larger batch sizes.
  • This means faster delivery, lower costs, and less maintenance, especially for high-volume data transfers.
  • The total cost is the sum of all individual subscriptions.

Fair Usage Policy

  • Each customer receives data coins under the Fair Usage Policy.
  • Each organization is attributed 10,000 coins per asset per month, regardless of the subscribed value pack, with a minimum of 300,000 coins per month per fleet.
  • The 300,000-coins-per-month-per-fleet minimum accommodates organizations with fewer than 30 assets, for which the standard 10,000 coins per asset per month might not be sufficient even when the customer respects the Fair Usage Policy.
  • As part of the “Core” pack, and unrelated to any other value pack, each organization is also attributed a one-time allocation of 10,000 coins.
📘

Note:

  • With Datahub Subscriptions, avoid a batch size of 1 unless it is necessary.
  • A batch size of 1 results in a far greater number of pushes to the webhook endpoint, and therefore far higher coin consumption than usual. Use a batch size of 1 only when it is genuinely required.

Best Practices

  • Avoid collecting the same real-time data more than once. The SCALAR Datahub APIs provide selection methods for syncing that avoid overlapping ranges.
  • Make use of pagination: if you request a batch of data too large to fit in a single page response, make sure you collect all pages to get the full result (the sketch after this list shows pagination together with token reuse).
  • Store your created token and keep reusing it until it expires. Creating a new token for every web-service request will exceed the limit of three tokens per account and audience per hour.
  • Although usage is classified as “Fair” or “Unfair”, customers are free to use the platform as their needs require. Unfair use is detected through the credit system: the actual number of data coins consumed will likely exceed the number of attributed data coins the customer is entitled to under Fair Usage.
  • If a customer exceeds the number of attributed coins, the supplier has the right to charge the customer for the extra coins consumed.
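
The token-reuse and pagination practices above can be combined in one small client. Below is a minimal sketch, assuming a client-credentials token endpoint and a paged JSON response with `items` and `hasNextPage` fields; all URLs and field names are hypothetical placeholders, not the actual SCALAR API.

```python
import time

import requests  # third-party HTTP client, used here for illustration

# Hypothetical endpoints -- substitute the real SCALAR URLs from your documentation.
TOKEN_URL = "https://api.example.com/oauth/token"
DATA_URL = "https://api.example.com/datahub/v1/positions"

_token_cache = {"value": None, "expires_at": 0.0}

def get_token(client_id: str, client_secret: str) -> str:
    """Reuse the cached token until shortly before it expires: only three
    tokens per account and audience may be created per hour."""
    if _token_cache["value"] and time.time() < _token_cache["expires_at"] - 60:
        return _token_cache["value"]
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    resp.raise_for_status()
    body = resp.json()
    _token_cache["value"] = body["access_token"]
    _token_cache["expires_at"] = time.time() + body["expires_in"]
    return _token_cache["value"]

def fetch_all_pages(token: str) -> list:
    """Walk every page of a large response so no records are missed."""
    records, page = [], 1
    while True:
        resp = requests.get(
            DATA_URL,
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page},
        )
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["items"])      # field names are assumptions
        if not body.get("hasNextPage"):
            break
        page += 1
    return records
```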

Pool Calculation:

  • The maximum number of “Subscribed” assets during the billing month is used to calculate the total coin pool for an organization.
  • The “subscribed” term refers to any asset type that is equipped with a connectivity unit that communicates and sends data to the SCALAR platform.
  • The total pool is based on the maximum number of subscribed assets during the billing cycle and is reflected in the Datahub portal the following day. This maximum count of subscribed assets drives the billing calculation of total used data coins versus the available pool of coins.
    For example, an organization with 100 subscribed assets will have a pool of 1,000,000 coins per month (100 x 10,000); see the sketch below.
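
Expressed as a calculation, the pool is the per-asset allocation times the maximum subscribed asset count, floored at the fleet minimum. A minimal sketch (the function name is ours):

```python
COINS_PER_ASSET = 10_000       # attributed per subscribed asset per month
MIN_POOL_PER_FLEET = 300_000   # floor for fleets with fewer than 30 assets

def monthly_coin_pool(max_subscribed_assets: int) -> int:
    """Coin pool for one billing month, per the Fair Usage Policy."""
    return max(max_subscribed_assets * COINS_PER_ASSET, MIN_POOL_PER_FLEET)

assert monthly_coin_pool(100) == 1_000_000  # the 100-asset example above
assert monthly_coin_pool(12) == 300_000     # small fleet falls back to the floor
```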

Usage Visualization

Usage Transparency:

  • Metrics Display: Organizations can view used and total coins available.
  • Interface: The display is integrated into the Datahub portal and/or a dedicated dashboard.
  • Benefit: Customers get real-time insights into their data consumption, helping them manage API calls and avoid overages.

API vs Subscription: Call Weightage Standardization

Datahub APIs

All API calls are weighted equally: 1 API call = 1 coin.

Datahub Subscription-Based Integration

With Datahub Subscriptions, SCALAR automatically pushes data to you in batches, so there is no need to constantly pull data. Data Coin consumption scales with the batch size per subscription.

Method   Batch size   Records sent (per push/call)   Coins consumed (counted/invoiced)
Pull     N/A          1–100 (pagination)             1
Push     1            1                              1
Push     10           10                             1
Push     100          100                            1
Push     500          500                            1

Rule of Thumb:

The larger the batch size, the fewer Data Coins are consumed per record delivered in a subscription.
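
To see the effect in numbers, here is a small worked sketch assuming, per the table above, that every push is invoiced as one coin regardless of how many records it carries:

```python
import math

def coins_for_subscription(total_records: int, batch_size: int) -> int:
    """Each push costs 1 coin, so the cost equals the number of pushes."""
    return math.ceil(total_records / batch_size)

# Delivering 10,000 records:
print(coins_for_subscription(10_000, 1))    # 10,000 coins -- avoid batch size 1
print(coins_for_subscription(10_000, 100))  # 100 coins
print(coins_for_subscription(10_000, 500))  # 20 coins
```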

Limit Subscriptions per Event per Integrator to Two

To avoid excessive or redundant subscriptions, which can lead to unnecessary system load and potential data duplication, each integrator may create no more than two active subscriptions per event type. This ensures more efficient resource usage and better governance, while still offering enough flexibility for integrators to manage primary and backup data flows. It also prevents over-subscription that could cause performance bottlenecks or overflow issues downstream.
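
Server-side, the restriction amounts to a simple count check before a subscription is created. A minimal sketch, with hypothetical event names:

```python
from collections import Counter

MAX_SUBSCRIPTIONS_PER_EVENT = 2

def can_create_subscription(active_event_types: list[str], event_type: str) -> bool:
    """Allow a new subscription only while the integrator has fewer than
    two active subscriptions for the same event type."""
    return Counter(active_event_types)[event_type] < MAX_SUBSCRIPTIONS_PER_EVENT

active = ["asset.position", "asset.position", "driver.status"]  # hypothetical names
assert not can_create_subscription(active, "asset.position")  # a third is rejected
assert can_create_subscription(active, "driver.status")       # a backup flow is fine
```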

Set Minimum Allowed Wait Time Based on Batch Size Ranges

To maintain optimal performance and avoid excessive subscription pushes or message flooding, the minimum wait time that can be configured for a subscription now depends on the selected batch size. For small batch sizes (0–100), the shortest permissible wait time is 5 seconds; for medium batches (101–250), at least 10 seconds is required; and for large batches (251–500), no less than 30 seconds is allowed. These guardrails keep data aggregation efficiently balanced with delivery frequency, minimizing strain on system resources while maintaining timely updates.
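
The guardrails map directly to a validation rule. A minimal sketch:

```python
def min_wait_time_seconds(batch_size: int) -> int:
    """Minimum configurable wait time for a subscription, by batch size."""
    if not 0 <= batch_size <= 500:
        raise ValueError("batch size must be between 0 and 500")
    if batch_size <= 100:
        return 5    # small batches
    if batch_size <= 250:
        return 10   # medium batches
    return 30       # large batches

assert min_wait_time_seconds(50) == 5
assert min_wait_time_seconds(200) == 10
assert min_wait_time_seconds(400) == 30
```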

Rate limiting

Rate limiting controls how many calls an integrator can make, or the number of requests an API can handle, in a specific period. The following table describes the rate limiting model, which consists of three tiers:

Sr. No.   Tier name    Customer type                                                       Process
1         FREE         Any customer who has only subscribed to the “Core” pack            Automatic
2         PRO          All paying customers                                                Automatic
3         ENTERPRISE   Cases that require manual intervention to upscale the rate limit   Manual (see steps below)

Upgrading to the ENTERPRISE tier:

  1. The customer reaches out to the Account Manager or the customer success team with a request to increase the rate limit to Enterprise.
  2. The customer discusses the topic with the CBO team to cover any monetary implications, and the Integration team (Pieter-Geert’s team) confirms there is a genuine need and legitimate reason for the upgrade.
  3. Upon approval from both teams, the Integration team raises a request to the Salesforce team to upgrade the organization to the Enterprise tier.

The following tables list the specific rate limits per type of API:

Organization specific per endpoint (per minute)

API type   FREE (10)   PRO (40)   ENTERPRISE (60)
small      6           20         60
normal     6           60         600
large      6           600        6000
xlarge     6           900        9000

Integrator specific per endpoint (per minute)

API type   FREE (10)   PRO (40)   ENTERPRISE (60)
small      6           20         60
normal     6           40         600
large      6           400        4000
xlarge     6           600        6000

📘

Note:

The first column does NOT relate to fleet size, but rather the type of API. Each API is categorized according to these 4 levels (small, normal, large, xlarge). So, rate limits are defined based on the category that was assigned to each specific API.

Why are the Organization and Integrator rate limit tables differentiated?

Integrator rate limits are slightly lower than the Organization limits so that a single integrator under an organization cannot consume the organization's entire rate limit and thereby starve other integrators.
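
Conceptually, every call has to pass two budgets: the integrator's own limit and the organization-wide limit. A minimal fixed-window sketch of that layering follows (limits taken from the PRO column for a “normal” API; the counter logic is our simplification, not SCALAR's implementation):

```python
import time
from collections import defaultdict

ORG_LIMIT = 60         # organization, "normal" API, PRO tier (table above)
INTEGRATOR_LIMIT = 40  # integrator, "normal" API, PRO tier (table above)

_windows = defaultdict(lambda: {"start": 0.0, "count": 0})

def _allow(key: str, limit: int) -> bool:
    """Fixed one-minute window: reset the counter when the window rolls over."""
    w, now = _windows[key], time.time()
    if now - w["start"] >= 60:
        w["start"], w["count"] = now, 0
    if w["count"] >= limit:
        return False
    w["count"] += 1
    return True

def allow_request(org_id: str, integrator_id: str, endpoint: str) -> bool:
    """A call must fit under the integrator budget AND the organization
    budget, so no single integrator can exhaust the org-wide allowance."""
    return (_allow(f"{org_id}/{integrator_id}/{endpoint}", INTEGRATOR_LIMIT)
            and _allow(f"{org_id}/{endpoint}", ORG_LIMIT))
```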