Part of the research we are conducting requires us to estimate how detrimental our proposed algorithm is to the battery life of the wearables we use. To do so, I opted to simply charge the wearable to 100%, let it run the algorithm for an hour, and sample the battery percentage once per minute. If, for example, the battery level drops to 95% by the end of that hour, I would reason that the algorithm consumes 5% of the battery per hour and therefore that the device has a battery life of about 20 hours.
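For concreteness, this is roughly how I intend to collect the samples and do the extrapolation. It is only a minimal sketch: `BatteryDrainLogger` and its methods are illustrative names of my own, and I'm assuming a WatchKit app with battery monitoring enabled that stays active for the whole hour so the timer keeps firing.

```swift
import Foundation
import WatchKit

/// Logs the battery level once per minute and does the naive linear
/// extrapolation described above: % dropped per hour -> hours until empty.
final class BatteryDrainLogger {
    private var samples: [(time: Date, level: Float)] = []
    private var timer: Timer?

    func start(samplingInterval: TimeInterval = 60) {
        // Battery monitoring must be enabled, otherwise batteryLevel reads -1.
        WKInterfaceDevice.current().isBatteryMonitoringEnabled = true
        recordSample()
        // Note: a Timer only fires while the app is active, so the app has to
        // stay in the foreground (or be kept active) for the measurement hour.
        timer = Timer.scheduledTimer(withTimeInterval: samplingInterval,
                                     repeats: true) { [weak self] _ in
            self?.recordSample()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func recordSample() {
        let level = WKInterfaceDevice.current().batteryLevel  // 0.0 ... 1.0
        samples.append((Date(), level))
    }

    /// Estimated battery life in hours from the first and last sample only,
    /// e.g. 100% -> 95% over one hour  =>  5%/h  =>  20 h.
    func estimatedBatteryLifeHours() -> Double? {
        guard let first = samples.first, let last = samples.last,
              last.level < first.level else { return nil }
        let elapsedHours = last.time.timeIntervalSince(first.time) / 3600
        let dropPerHour = Double(first.level - last.level) / elapsedHours
        return 1.0 / dropPerHour
    }
}
```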
What makes me question this approach is my previous experience with battery-operated devices, specifically smartphones. The time it took my old phone to go from 100% to 95% was noticeably longer than the time it took to go from, say, 50% to 45%. Moreover, the former also seemed to depend on how long the phone stayed on the charger after it was fully charged.
Given this observation, I'm not sure whether the experimental procedure described above would yield reliable results. What would be the correct way to proceed? The device in question is a 38 mm Apple Watch Series 3.
Lastly, I'd like to apologize in advance if I'm not using the correct battery terminology.