I'm a student new to sensors, and I'm currently calibrating the gyroscope of an ICM42670 IMU. I obtained the bias by leaving the gyro stationary for some time and averaging its output. Now I'm trying to find the scale factor by rotating the gyro through a known angle (180 degrees) and comparing the integrated angle against it, but I'm having trouble integrating the data.
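For reference, the bias step was just an average of the stationary output; a minimal sketch of that step in MATLAB, where the file name is a placeholder for my stationary recording:

bias_raw = importdata("gyro_bias_data.txt");   % placeholder: raw samples recorded while stationary
y_bias = mean(bias_raw) / 131;                 % average stationary error in deg/s; came out near zero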
I configured the gyroscope for a range of ±250 dps, which corresponds to a sensitivity of 131 LSB/dps. The output data rate is 50 Hz and the low-pass filter is at 53 Hz (I can change these settings if they might be the problem). These are my steps for integrating:
- Got raw data from the gyro at a matching 50 Hz sampling rate (samples 0.02 s apart) while rotating it 180 degrees and back
- Loaded the data into MATLAB
- Converted it to deg/s by dividing by the sensitivity (see the quick check after this list)
- Subtracted the previously found bias (which was almost zero)
- Used the cumtrapz() function to integrate the rate into an angle
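As a quick sanity check on the sensitivity figure (this arithmetic is mine, not quoted from the datasheet): for a signed 16-bit output, the sensitivity is the positive half of the digital range divided by the full scale:

sensitivity = 2^15 / 250;   % = 131.072 LSB/dps for the ±250 dps range, rounded to 131 in the datasheet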
Here is the code snippet:
dT = 1/50;                          % sample period for the 50 Hz output data rate
rate_raw = importdata("esp\gyro_calibration\gyro_integration_data.txt");
rate = (rate_raw / 131) - y_bias;   % convert LSB to deg/s, then subtract the bias measured earlier (~0)
t = (0:length(rate)-1) * dT;        % time vector built from the sample period, not a hardcoded 10 s span
angle = cumtrapz(t, rate);          % trapezoidal integration: deg/s -> deg
figure(2)
plot(t, angle, 'LineWidth', 1);
grid on
xlabel("Time (s)"); ylabel("Angle (deg)");
title("Gyroscope angle output");
The integrated output peaks at almost exactly 90 degrees, which is half of the 180 degrees I rotated, and I've spent a whole day trying to understand why. (The scale-factor step I'm ultimately aiming for is sketched below.) Any help is greatly appreciated!
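For completeness, this is the scale-factor computation I'm working toward once the integrated angle is trustworthy; a minimal sketch, assuming angle is the integrated signal from the code above (measured and expected are just my placeholder names):

measured = max(abs(angle));           % peak of the integrated angle, in degrees
expected = 180;                       % the known physical rotation
scale_factor = expected / measured;   % with my current data this comes out near 2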