[This is a cross-post of a question I originally asked on SO. I think it is more appropriate here.]
I was going through the AdWords API and came across their rate sheet: http://code.google.com/apis/adwords/docs/ratesheet.html .
They charge $0.25 per 1,000 API units and, under the 'Operation Costs' section, list the cost (in API units) of different API calls. I am curious: based on what factors do they (and other API developers) calculate the cost of an API call? Is there a simple formula or a standard way to determine this?
Note: When I say the 'cost' of an API call, I don't mean the money but the API units. For example, how do you determine that one API call costs 100 'units' and another 1,000?
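To make the question concrete, here is the kind of scheme I could imagine behind the scenes: a weighted sum of the resources a call consumes, rounded to whole units. Everything in this Python sketch (the resource names, weights and scale) is made up by me as an illustration, not anything Google documents:

    # Hypothetical weighting scheme -- resource names, weights and scale
    # are placeholders, not anything from the AdWords documentation.
    def api_units(cpu_seconds, db_queries, rows_returned, bytes_sent):
        """Convert raw per-call resource usage into abstract 'API units'."""
        WEIGHTS = {
            "cpu_seconds":   400.0,   # CPU time is usually the scarcest resource
            "db_queries":     10.0,   # each query adds fixed overhead
            "rows_returned":   0.5,   # large result sets cost more to serialize
            "bytes_sent":      0.001, # bandwidth
        }
        raw = (WEIGHTS["cpu_seconds"] * cpu_seconds
               + WEIGHTS["db_queries"] * db_queries
               + WEIGHTS["rows_returned"] * rows_returned
               + WEIGHTS["bytes_sent"] * bytes_sent)
        return max(1, round(raw))  # never charge less than one unit

    # A heavy reporting call vs. a cheap lookup:
    print(api_units(cpu_seconds=0.5, db_queries=20, rows_returned=1000, bytes_sent=200_000))
    print(api_units(cpu_seconds=0.01, db_queries=1, rows_returned=1, bytes_sent=2_000))

With these made-up weights, the heavy reporting call works out to about 1,100 units and the cheap lookup to under 20. Is the real approach anything like this, or is it done some other way entirely?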
Expanding on my original post: each API call consumes a certain amount of CPU time, memory and bandwidth on the server (and perhaps other resources I haven't thought of). I would specifically like to know what tools / methods are available to determine these metrics (see the measurement sketch after this list) if the API is developed using any combination of the following:
Perl, Python, PHP, .NET, Ruby (or some other language).
Postgres, MySQL, MS SQL Server, Firebird, SQLite (or some other database).
Apache, IIS (or some other web server).
Windows, Linux (or some other OS).
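As for measurement, below is a rough sketch of the kind of per-call instrumentation I have in mind, in Python only because it is one of the languages listed above. The handler and numbers are made up; note that resource.getrusage is Unix-only, and the units of ru_maxrss are platform-dependent (kilobytes on Linux, bytes on macOS).

    # Rough sketch: wrap an API handler and record what one call costs.
    import functools
    import resource
    import time

    def measure(handler):
        """Report wall time, CPU time and bytes returned for each call."""
        @functools.wraps(handler)
        def wrapper(*args, **kwargs):
            before = resource.getrusage(resource.RUSAGE_SELF)
            start = time.perf_counter()
            response = handler(*args, **kwargs)
            wall = time.perf_counter() - start
            after = resource.getrusage(resource.RUSAGE_SELF)
            cpu = (after.ru_utime + after.ru_stime) - (before.ru_utime + before.ru_stime)
            print(f"{handler.__name__}: wall={wall:.4f}s cpu={cpu:.4f}s "
                  f"bytes={len(response)} maxrss={after.ru_maxrss}")
            return response
        return wrapper

    @measure
    def list_campaigns():
        # stand-in for a real API handler that would hit the database
        return b"x" * 50_000

    list_campaigns()

Running this prints the wall time, CPU time, response size and peak RSS for a call, which seems like the raw data a per-operation unit cost would be derived from. Are there established profiling tools for the stacks above that do this kind of per-request accounting, rather than hand-rolling it like this?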