Using Android gyroscope instead of accelerometer. I find lots of bits and pieces, but no complete code


Solution 1

Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.

But first, I'll tell you why you don't need it.

Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.

Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.

In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.

I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.

Finally, as Stan mentioned above, Invensense has their own sensor fusion implementation at the chip level.

Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.

In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION, which takes accelerometer and magnetometer input and creates roll/pitch/heading output. (By the way, you should not use this sensor; it has too many limitations.)

But the important thing is that newer versions of Android contain these two new virtual sensors:

TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out.

TYPE_LINEAR_ACCELERATION is the accelerometer input with the gravity component filtered out.

These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
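For instance, here's a minimal sketch of consuming TYPE_LINEAR_ACCELERATION (registration is assumed to follow the same pattern as the fuller sample below):

  // Sketch only: TYPE_LINEAR_ACCELERATION reports motion-only acceleration
  // in m/s^2 on the same device axes as TYPE_ACCELEROMETER; the gravity
  // component is reported separately by TYPE_GRAVITY.
  @Override
  public void onSensorChanged(SensorEvent event) {
      if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
          float ax = event.values[0];
          float ay = event.values[1];
          float az = event.values[2];
          // ... use (ax, ay, az) as the device's linear acceleration ...
      }
  }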

Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.

However, Quaternions are a little bit abstract for most people, and since you're likely working with 3-d transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().

One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.

Anyway, here's some sample code to show how I do it.

  // Requires Android 2.3 (API 9) or later for TYPE_GRAVITY; on older
  // devices the TYPE_ACCELEROMETER fallback below is used instead.

  import android.app.Activity;
  import android.content.Context;
  import android.hardware.Sensor;
  import android.hardware.SensorEvent;
  import android.hardware.SensorEventListener;
  import android.hardware.SensorManager;
  import android.os.Bundle;
  import android.util.Log;

  class Foo extends Activity implements SensorEventListener {

    static final String TAG = "Foo";
    // Conversion factor, radians to degrees.
    static final float DEG = (float)(180.0 / Math.PI);

    SensorManager sensorManager;
    float[] gData = new float[3];           // Gravity or accelerometer
    float[] mData = new float[3];           // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get the sensor manager from system services
        sensorManager =
          (SensorManager)getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register our listeners. getDefaultSensor() returns null when a
        // sensor type isn't available on this device.
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        if (gsensor != null)
            sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        if (asensor != null)
            sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        if (msensor != null)
            sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch( event.sensor.getType() ) {
          case Sensor.TYPE_GRAVITY:
            gData[0] = event.values[0];
            gData[1] = event.values[1];
            gData[2] = event.values[2];
            haveGrav = true;
            break;
          case Sensor.TYPE_ACCELEROMETER:
            if (haveGrav) break;    // don't need it, we have better
            gData[0] = event.values[0];
            gData[1] = event.values[1];
            gData[2] = event.values[2];
            haveAccel = true;
            break;
          case Sensor.TYPE_MAGNETIC_FIELD:
            mData[0] = event.values[0];
            mData[1] = event.values[1];
            mData[2] = event.values[2];
            haveMag = true;
            break;
          default:
            return;
        }

        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "mh: " + (int)(orientation[0]*DEG));
            Log.d(TAG, "pitch: " + (int)(orientation[1]*DEG));
            Log.d(TAG, "roll: " + (int)(orientation[2]*DEG));
            Log.d(TAG, "yaw: " + (int)(orientation[0]*DEG));   // same as mh
            Log.d(TAG, "inclination: " + (int)(incl*DEG));
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not used in this example.
    }
  }

Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to a rotation matrix.
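For completeness, here's roughly what that looks like. getRotationMatrixFromVector() is a real SensorManager call (API 9+); the rest is just a sketch, with registration assumed to follow the same pattern as above:

  float[] rotMat = new float[9];

  @Override
  public void onSensorChanged(SensorEvent event) {
      if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
          // Converts the rotation vector (effectively a quaternion) into a
          // 3x3 row-major rotation matrix, ready for getOrientation() or
          // for transforming device coordinates to world coordinates.
          SensorManager.getRotationMatrixFromVector(rotMat, event.values);
      }
  }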

Solution 2

To the question of where to find complete code, here's the default implementation from Android Jelly Bean: https://android.googlesource.com/platform/frameworks/base/+/jb-release/services/sensorservice/ Start by checking fusion.cpp/h. It uses Modified Rodrigues Parameters (a three-parameter attitude representation derived from the quaternion) instead of quaternions. In addition to orientation, the Kalman filter estimates gyro drift. For measurement updates it uses the magnetometer and, a bit incorrectly, acceleration (specific force).
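For background (this is standard attitude-estimation material, not something taken from the Android source): Modified Rodrigues Parameters pack the attitude into three numbers derived from the unit quaternion $\mathbf{q} = (q_w, \mathbf{q}_v)$, where $\hat{\mathbf{n}}$ and $\theta$ are the rotation axis and angle:

$$\boldsymbol{\sigma} = \frac{\mathbf{q}_v}{1 + q_w} = \hat{\mathbf{n}}\,\tan\frac{\theta}{4}$$

The parametrization is singular only at $\theta = \pm 360°$, which is part of its appeal for filtering.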

To make use of the code you should either be a wizard or know the basics of INS and KF. Many parameters have to be fine-tuned for the filter to work. As Edward aptly put it, these guys are doing this for a living.

At least on Google's Galaxy Nexus, this default implementation is left unused and is overridden by InvenSense's proprietary system.


Comments

  • HappyEngineer almost 2 years

    The Sensor Fusion video looks great, but there's no code: http://www.youtube.com/watch?v=C7JQ7Rpwn2k&feature=player_detailpage#t=1315s

    Here is my code which just uses accelerometer and compass. I also use a Kalman filter on the 3 orientation values, but that's too much code to show here. Ultimately, this works ok, but the result is either too jittery or too laggy depending on what I do with the results and how low I make the filtering factors.

    /** Just accelerometer and magnetic sensors */
    public abstract class SensorsListener2
        implements
            SensorEventListener
    {
        /** The lower this is, the greater the preference which is given to previous values. (slows change) */
        private static final float accelFilteringFactor = 0.1f;
        private static final float magFilteringFactor = 0.01f;
    
        public abstract boolean getIsLandscape();
    
        @Override
        public void onSensorChanged(SensorEvent event) {
            Sensor sensor = event.sensor;
            int type = sensor.getType();
    
            switch (type) {
                case Sensor.TYPE_MAGNETIC_FIELD:
                    mags[0] = event.values[0] * magFilteringFactor + mags[0] * (1.0f - magFilteringFactor);
                    mags[1] = event.values[1] * magFilteringFactor + mags[1] * (1.0f - magFilteringFactor);
                    mags[2] = event.values[2] * magFilteringFactor + mags[2] * (1.0f - magFilteringFactor);
    
                    isReady = true;
                    break;
                case Sensor.TYPE_ACCELEROMETER:
                    accels[0] = event.values[0] * accelFilteringFactor + accels[0] * (1.0f - accelFilteringFactor);
                    accels[1] = event.values[1] * accelFilteringFactor + accels[1] * (1.0f - accelFilteringFactor);
                    accels[2] = event.values[2] * accelFilteringFactor + accels[2] * (1.0f - accelFilteringFactor);
                    break;
    
                default:
                    return;
            }
    
    
    
    
            if(mags != null && accels != null && isReady) {
                isReady = false;
    
                SensorManager.getRotationMatrix(rot, inclination, accels, mags);
    
                boolean isLandscape = getIsLandscape();
                if(isLandscape) {
                    outR = rot;
                } else {
                    // Remap the coordinates to work in portrait mode.
                    SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
                }
    
                SensorManager.getOrientation(outR, values);
    
                double x180pi = 180.0 / Math.PI;
                float azimuth = (float)(values[0] * x180pi);
                float pitch = (float)(values[1] * x180pi);
                float roll = (float)(values[2] * x180pi);
    
                // In landscape mode swap pitch and roll and invert the pitch.
                if(isLandscape) {
                    float tmp = pitch;
                    pitch = -roll;
                    roll = -tmp;
                    azimuth = 180 - azimuth;
                } else {
                    pitch = -pitch - 90;
                    azimuth = 90 - azimuth;
                }
    
                onOrientationChanged(azimuth,pitch,roll);
            }
        }
    
    
    
    
        private float[] mags = new float[3];
        private float[] accels = new float[3];
        private boolean isReady;
    
        private float[] rot = new float[9];
        private float[] outR = new float[9];
        private float[] inclination = new float[9];
        private float[] values = new float[3];
    
    
    
        /**
        Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
        Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
        Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
        */
        public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
    }
    

    I tried to figure out how to add gyroscope data, but I am just not doing it right. The Google doc at http://developer.android.com/reference/android/hardware/SensorEvent.html shows some code to get a delta matrix from the gyroscope data. The idea seems to be that I'd crank down the filters for the accelerometer and magnetic sensors so that they were really stable. That would keep track of the long-term orientation.

    Then, I'd keep a history of the most recent N delta matrices from the gyroscope. Each time I got a new one I'd drop off the oldest one and multiply them all together to get a final matrix which I would multiply against the stable matrix returned by the accelerometer and magnetic sensors.

    This doesn't seem to work. Or, at least, my implementation of it does not work. The result is far more jittery than just the accelerometer. Increasing the size of the gyroscope history actually increases the jitter which makes me think that I'm not calculating the right values from the gyroscope.

    public abstract class SensorsListener3
        implements
            SensorEventListener
    {
        /** The lower this is, the greater the preference which is given to previous values. (slows change) */
        private static final float kFilteringFactor = 0.001f;
        private static final float magKFilteringFactor = 0.001f;
    
    
        public abstract boolean getIsLandscape();
    
        @Override
        public void onSensorChanged(SensorEvent event) {
            Sensor sensor = event.sensor;
            int type = sensor.getType();
    
            switch (type) {
                case Sensor.TYPE_MAGNETIC_FIELD:
                    mags[0] = event.values[0] * magKFilteringFactor + mags[0] * (1.0f - magKFilteringFactor);
                    mags[1] = event.values[1] * magKFilteringFactor + mags[1] * (1.0f - magKFilteringFactor);
                    mags[2] = event.values[2] * magKFilteringFactor + mags[2] * (1.0f - magKFilteringFactor);
    
                    isReady = true;
                    break;
                case Sensor.TYPE_ACCELEROMETER:
                    accels[0] = event.values[0] * kFilteringFactor + accels[0] * (1.0f - kFilteringFactor);
                    accels[1] = event.values[1] * kFilteringFactor + accels[1] * (1.0f - kFilteringFactor);
                    accels[2] = event.values[2] * kFilteringFactor + accels[2] * (1.0f - kFilteringFactor);
                    break;
    
                case Sensor.TYPE_GYROSCOPE:
                    gyroscopeSensorChanged(event);
                    break;
    
                default:
                    return;
            }
    
    
    
    
            if(mags != null && accels != null && isReady) {
                isReady = false;
    
                SensorManager.getRotationMatrix(rot, inclination, accels, mags);
    
                boolean isLandscape = getIsLandscape();
                if(isLandscape) {
                    outR = rot;
                } else {
                    // Remap the coordinates to work in portrait mode.
                    SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
                }
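                // Note (accurate to this code): matrixResult below is only
                // the product of the buffered gyro deltas; assigning it to
                // outR replaces the accel/mag-derived matrix instead of
                // concatenating with it, which may not be what the text
                // above intends.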
    
                if(gyroUpdateTime!=0) {
                    matrixHistory.mult(matrixTmp,matrixResult);
                    outR = matrixResult;
                }
    
                SensorManager.getOrientation(outR, values);
    
                double x180pi = 180.0 / Math.PI;
                float azimuth = (float)(values[0] * x180pi);
                float pitch = (float)(values[1] * x180pi);
                float roll = (float)(values[2] * x180pi);
    
                // In landscape mode swap pitch and roll and invert the pitch.
                if(isLandscape) {
                    float tmp = pitch;
                    pitch = -roll;
                    roll = -tmp;
                    azimuth = 180 - azimuth;
                } else {
                    pitch = -pitch - 90;
                    azimuth = 90 - azimuth;
                }
    
                onOrientationChanged(azimuth,pitch,roll);
            }
        }
    
    
    
        private void gyroscopeSensorChanged(SensorEvent event) {
            // This timestep's delta rotation to be multiplied by the current rotation
            // after computing it from the gyro sample data.
            if(gyroUpdateTime != 0) {
                final float dT = (event.timestamp - gyroUpdateTime) * NS2S;
                // Axis of the rotation sample, not normalized yet.
                float axisX = event.values[0];
                float axisY = event.values[1];
                float axisZ = event.values[2];
    
                // Calculate the angular speed of the sample
                float omegaMagnitude = (float)Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
    
                // Normalize the rotation vector if it's big enough to get the axis
                if(omegaMagnitude > EPSILON) {
                    axisX /= omegaMagnitude;
                    axisY /= omegaMagnitude;
                    axisZ /= omegaMagnitude;
                }
    
                // Integrate around this axis with the angular speed by the timestep
                // in order to get a delta rotation from this sample over the timestep
                // We will convert this axis-angle representation of the delta rotation
                // into a quaternion before turning it into the rotation matrix.
                float thetaOverTwo = omegaMagnitude * dT / 2.0f;
                float sinThetaOverTwo = (float)Math.sin(thetaOverTwo);
                float cosThetaOverTwo = (float)Math.cos(thetaOverTwo);
                deltaRotationVector[0] = sinThetaOverTwo * axisX;
                deltaRotationVector[1] = sinThetaOverTwo * axisY;
                deltaRotationVector[2] = sinThetaOverTwo * axisZ;
                deltaRotationVector[3] = cosThetaOverTwo;
            }
            gyroUpdateTime = event.timestamp;
            SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
            // User code should concatenate the delta rotation we computed with the current rotation
            // in order to get the updated rotation.
            // rotationCurrent = rotationCurrent * deltaRotationMatrix;
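            // Note (accurate to this code): add() stores a reference, and
            // deltaRotationMatrix is overwritten on every event, so the
            // history ends up holding many pointers to the same (latest)
            // matrix. Storing a copy, e.g. deltaRotationMatrix.clone(),
            // would keep the samples distinct.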
            matrixHistory.add(deltaRotationMatrix);
        }
    
    
    
        private float[] mags = new float[3];
        private float[] accels = new float[3];
        private boolean isReady;
    
        private float[] rot = new float[9];
        private float[] outR = new float[9];
        private float[] inclination = new float[9];
        private float[] values = new float[3];
    
        // gyroscope stuff
        private long gyroUpdateTime = 0;
        private static final float NS2S = 1.0f / 1000000000.0f;
        private float[] deltaRotationMatrix = new float[9];
        private final float[] deltaRotationVector = new float[4];
    //TODO: I have no idea how small this value should be.
        private static final float EPSILON = 0.000001f;
        private float[] matrixMult = new float[9];
        private MatrixHistory matrixHistory = new MatrixHistory(100);
        private float[] matrixTmp = new float[9];
        private float[] matrixResult = new float[9];
    
    
        /**
        Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West 
        Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis. 
        Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
        */
        public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
    }
    
    
    public class MatrixHistory
    {
        public MatrixHistory(int size) {
            vals = new float[size][];
        }
    
        public void add(float[] val) {
            synchronized(vals) {
                vals[ix] = val;
                ix = (ix + 1) % vals.length;
                if(ix==0)
                    full = true;
            }
        }
    
        public void mult(float[] tmp, float[] output) {
            synchronized(vals) {
                // Multiply the stored matrices together in array index
                // order. (Caveat, accurate to this code: once the ring
                // buffer has wrapped, index order is no longer
                // oldest-to-newest, so the deltas are concatenated out of
                // chronological order.)
                int n = full ? vals.length : ix;
                if(n == 0)
                    return;
                System.arraycopy(vals[0], 0, output, 0, vals[0].length);
                for(int i = 1; i < n; ++i) {
                    MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                    System.arraycopy(tmp, 0, output, 0, tmp.length);
                }
            }
        }
    
    
        private int ix = 0;
        private boolean full = false;
        private float[][] vals;
    }
    

    The second block of code contains my changes from the first block of code which add the gyroscope to the mix.

    Specifically, the filtering factor for accel is made smaller (making the value more stable). The MatrixHistory class keeps track of the last 100 gyroscope deltaRotationMatrix values which are calculated in the gyroscopeSensorChanged method.

    I've seen many questions on this site on this topic. They've helped me get to this point, but I cannot figure out what to do next. I really wish the Sensor Fusion guy had just posted some code somewhere. He obviously had it all put together.

    • Stan over 11 years
      According to the "Professional Android Sensor Programming" book, InvenSense's Sensor Fusion algorithms are proprietary, so you're unlikely to find the source code publicly available. The library is included in most modern devices at the system level, so Sensor.TYPE_ROTATION_VECTOR measurements already come with gyro-based short-term correction applied. I think the most elaborate public source on the matter is this. I'm not sure if it's a good replacement.
    • zapl over 11 years
      There are also several academic papers related to sensor fusion using Kalman filters. They typically don't contain source code but should have the technical and mathematical details you need. scholar.google.com
    • Hoan Nguyen over 11 years
      Why do you low-pass filter the magnetic values?
  • Max about 10 years
    Is there any chance one can access this data on the native side? Most of these things are not exposed via the NDK in sensor.h.
  • Burf2000 over 9 years
    Did you ever get this all working? Mine is still jumpy, and I have a low-pass filter.
  • chintan s about 9 years
    Hello Edward, this is a very good explanation, something I have been trying to find for some time. Can you please explain why it is necessary to use SensorManager.remapCoordinateSystem? Thanks.
  • Edward Falk about 9 years
    The sensor system on Android doesn't know or care how your device is rotated. Every device has its own coordinate system with +X to the right, +Y up, and +Z toward you. If the device is rotated, e.g. a phone in landscape orientation, you need to transform sensor coordinates into values that make sense for the way you're holding the device. What exactly that means depends on your application. remapCoordinateSystem() is a helper function that can be useful here. I have some more notes at efalk.org/Docs/Android/sensors_0.html#remapCoordinateSystem
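    For instance, the call from the answer's own sample (a sketch; inR is the matrix you got from getRotationMatrix(), and this particular axis pair is just one example mapping):

        float[] outR = new float[9];
        // Re-express the rotation for a device held 90 degrees from its
        // natural orientation.
        SensorManager.remapCoordinateSystem(inR,
                SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);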
  • chintan s about 9 years
    Thanks for your reply, Edward. Just one last question. I just looked at the TYPE_GRAVITY output and I must say it is very good. So in your code, do I need to filter the final pitch output? Or do I need to filter TYPE_MAGNETIC_FIELD before it is used? Thanks.
  • Edward Falk about 9 years
    I would write your application without filtering and then see if the results are stable enough for you. If not, try applying some filtering. GRAVITY probably doesn't need filtering, but MAGNETIC_FIELD probably does. You should also consider using ROTATION and converting the quaternion to a matrix. ROTATION pretty much does everything for you. Converting to a matrix is almost trivial.
  • chintan s about 9 years
    Thank you for your reply. So I wrote the code to combine GRAVITY and MAGNETIC_FIELD for orientation estimation, and the output is more stable when I filter the MAGNETIC_FIELD. I assume that the quaternion obtained from ROTATION is the device orientation. So in that case, there is no need for me to compute the orientation myself, is there? Thanks
  • Edward Falk about 9 years
    Yes, the Quaternion really has everything you want; it's just a matter of converting it to a matrix to convert device coordinates to world coordinates or vice versa.
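    For reference, that conversion is the standard unit-quaternion-to-matrix formula (a sketch; on Android, SensorManager.getRotationMatrixFromVector() does the same job for TYPE_ROTATION_VECTOR events):

        // Unit quaternion (w + xi + yj + zk) to a row-major 3x3 rotation matrix.
        static float[] quatToMatrix(float w, float x, float y, float z) {
            float[] m = new float[9];
            m[0] = 1 - 2*y*y - 2*z*z;  m[1] = 2*x*y - 2*z*w;      m[2] = 2*x*z + 2*y*w;
            m[3] = 2*x*y + 2*z*w;      m[4] = 1 - 2*x*x - 2*z*z;  m[5] = 2*y*z - 2*x*w;
            m[6] = 2*x*z - 2*y*w;      m[7] = 2*y*z + 2*x*w;      m[8] = 1 - 2*x*x - 2*y*y;
            return m;
        }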
  • chintan s about 9 years
    Thank you Edward, that really helped :)
  • chintan s about 9 years
    Hello Edward, can you please look at this problem that I have. Thank you. stackoverflow.com/questions/30529272/…
  • gaborous over 8 years
    Here's a sensor fusion library that may be of interest.
  • Edward Falk over 8 years
    Thanks; I just might give it a try.
  • Edward Falk almost 8 years
    I just found this superb video on the topic: youtube.com/watch?v=C7JQ7Rpwn2k
  • Sam about 7 years
    One disadvantage of this approach is that you can get crazy output from some of these built-in virtual sensors in certain scenarios. For example, on my Samsung Galaxy S5, all the virtual/composite sensors produce inverted results for a few seconds after you give the device a hard swing/shake when the polling period is 55000 microseconds or higher. Another example with the same device: once it went into a weird state where the readings of the virtual sensors were permanently backwards until the device was restarted. In both cases, the raw (non-virtual) sensor values were fine.