Think your Microsoft licensing costs are fixed? They're not.
Once upon a time, there were no limits on the size of Microsoft customers' API workloads. You could make 1,000 or 100,000 calls a day without penalty. That all changed in October 2019, when Microsoft revised how it reports on – and charges for – API calls to a customer's instance of Dynamics 365 online. As a result, customers' costs have skyrocketed (or are about to, depending on when their current agreement renews), and in some cases, without expert advice, their solutions are now potentially unaffordable.
For many customers, seeing their Microsoft costs escalate dramatically – doubling or even tripling – is a sobering moment. But in fairness, the Microsoft licensing model has changed to reflect a more modern, consumption-based approach, and Microsoft is not the only big vendor in the market to do so. The new service agreement and metrics are a way to ensure that paying customers receive the best possible service and platform performance.
How does that work, exactly? Let's say Customer A, in a multi-tenancy infrastructure, makes light use of the APIs at the backend of their Microsoft solution. Their neighbour, Customer B, is an extremely heavy API user. The bad news is that busy Customer B can degrade the performance of the tenancy for average-user Customer A and everyone else, especially when that consumption comes as a "surprise" to Microsoft, because an unexpected spike affects the data centre's ability to perform at its optimum.
Microsoft's introduction of the SLA and metrics mechanism ensures uptime, quality of service, and performance – and provides some fairness. Customers with low API workloads won't have their performance affected by those with heavy integration workloads, and those who do have significant workloads will have to pay for their additional consumption.
That's all well and good. But when you go from never having your API workload measured to paying by the call, it can add a considerable financial burden – one which, although mitigated by the brilliant platform you have access to, can come as a huge shock.
So how can we help you? You'll be (very) pleased to know that we can actively optimise and reduce your API workload, and therefore your costs.
Let's take another look at Customer B. By reviewing the Common Data Service, we can see that over the previous month Customer B made 7.5 million API calls – around 250,000 calls a day. The Microsoft metric requires you to stay below the magic 100,000 calls per day, meaning Customer B needs to purchase a further 150,000 calls per day over their base entitlement just to run their business. (Before the new licensing model, this wouldn't have been a problem!)
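The arithmetic behind those figures is worth making explicit. A minimal sketch, assuming a 30-day month and a flat 100,000-call daily base entitlement (both are simplifying assumptions for illustration, not quoted licensing terms):

```python
# Figures from Customer B's example, assuming a 30-day month and a
# flat 100,000-call daily base entitlement (illustrative assumptions).
monthly_calls = 7_500_000
daily_calls = monthly_calls // 30       # 250,000 calls per day
entitlement = 100_000                   # assumed daily base entitlement
shortfall = daily_calls - entitlement   # 150,000 extra calls per day to buy
```

Running the same sums against your own usage report is the quickest way to see how far over (or under) your entitlement you actually sit.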
Now, as a workaround, we can be smart: review Customer B's integration workloads, drill down into the services, and unpack how they're written to see whether the workloads can be optimised. For example, instead of individually extracting 100,000 contacts from their database for a marketing list (that's 100,000 API calls), we can use a paged query and retrieve 5,000 names at a time – which takes just 20 API calls instead of 100,000.
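The saving from paging is pure arithmetic. A minimal sketch – `api_calls_needed` is an illustrative helper, not part of any Microsoft SDK; the 5,000-record page size matches the example above:

```python
import math

def api_calls_needed(total_records: int, page_size: int) -> int:
    """Calls required to retrieve total_records when each call returns page_size records."""
    return math.ceil(total_records / page_size)

# Record-by-record retrieval: one call per contact
one_by_one = api_calls_needed(100_000, page_size=1)    # 100,000 calls

# Paged retrieval at 5,000 records per call
paged = api_calls_needed(100_000, page_size=5_000)     # 20 calls
```

The same principle applies to writes: batching updates into fewer, larger requests compresses the call count in exactly the same way.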
Let's say things have been really, really busy for Customer B. And they are now making 35,000,000 API transactions a month, which is a $20,000 increase per month from their pre-Oct 2019 plan. Gulp.
But never fear, we can optimise for that too.
If you do run over your API workload allocation, your administrator will get an automated notification from Microsoft requesting that you purchase more storage. Don't panic: if you go over your limit, nothing will 'stop', but until you increase your storage there are limitations on what you can do in your environment. For example, you won't be able to create a new sandbox environment or a Power Apps portal, because those require one gigabyte of available database capacity.
Your choices are to 1. buy more storage, or 2. look at the configuration of your integrations. We can guarantee that investing in reconfiguration is more affordable than purchasing more monthly storage ad infinitum!
You can also take a more proactive approach to minimising your costs by using your storage more effectively. For instance, you may have set up an internal workflow to track every business email and attachment sent to your call centre – which would chew through your file storage capacity and make it prohibitively expensive to manage. Taking a consultative approach to design, usage, and training will make a significant difference to your bottom line. Even using a process to extract attachments from emails and save them to SharePoint makes a huge difference. (A gigabyte of storage on SharePoint is about 3 cents, compared with $3 a gigabyte in your Azure storage infrastructure, so you can imagine how that helps keep your costs down!)
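To see what that price gap means in practice, here is a minimal sketch using the per-gigabyte figures quoted above; the 50 GB attachment volume is a hypothetical example, not a figure from any real tenant:

```python
def monthly_storage_cost(gigabytes: float, price_per_gb: float) -> float:
    """Flat monthly cost for a given volume at a given per-gigabyte price."""
    return gigabytes * price_per_gb

attachments_gb = 50  # hypothetical volume of extracted email attachments

in_platform = monthly_storage_cost(attachments_gb, 3.00)     # $150.00 at $3/GB
in_sharepoint = monthly_storage_cost(attachments_gb, 0.03)   # $1.50 at 3c/GB
```

At those rates, moving attachments to SharePoint cuts the storage bill for that data by roughly a factor of 100.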
Then there's load capacity. You only have an allowance of 2 gigabytes of storage on the backend of any Dynamics 365 application or Power Apps. It only takes someone configuring auditing requirements for either, and suddenly you've created a storage problem – especially at $15.10 per gigabyte.
What can you do now to make sure that when the new SLA metrics apply to you, they're not going to blow your budget?
We can analyse your environment for you and use trend analysis to project the potential cost of making no changes. Then we can recommend actions such as database archiving (moving data from the current layer into a more cost-effective one), creating virtual entities that point at the archived data, and reviewing your logs to establish whether they're still required.
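The trend analysis itself can be as simple as a linear projection. A minimal sketch, using the $15.10-per-gigabyte price and 2 GB allowance mentioned earlier; the current size and growth rate are hypothetical inputs you would replace with your own figures:

```python
def project_storage_gb(current_gb: float, monthly_growth_gb: float, months: int) -> float:
    """Linear projection of database size, assuming steady month-on-month growth."""
    return current_gb + monthly_growth_gb * months

def monthly_overage_cost(total_gb: float, included_gb: float, price_per_gb: float = 15.10) -> float:
    """Cost of capacity above the included allowance, at the quoted per-GB price."""
    return max(0.0, total_gb - included_gb) * price_per_gb

# Hypothetical tenant: 10 GB today, growing 2 GB a month, 2 GB included
in_a_year = project_storage_gb(10, 2, 12)              # 34 GB after 12 months
cost = monthly_overage_cost(in_a_year, included_gb=2)  # 32 GB over the allowance
```

Even this rough model makes the choice concrete: archiving that holds growth flat saves the projected overage every month from then on.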
Whether it's API calls or storage, we can't change the rules for you. But we have some smart ideas to help you find your way around paying over the odds for what you use.