How to convert my binary (hex) data to latitude and longitude?


I have reordered your data so that we first have the 3 longitudes and then the 3 latitudes:

74.438085, 74.438338, 74.669105, 43.004605, 42.938628, 42.993772

The best fit of the hexadecimals I can come up with is:

74.437368, 74.439881, 74.668392, 42.993224, 42.961388, 42.982391

The differences are: -0.000717, 0.001543, -0.000713, -0.011381, 0.022760, -0.011381

The program that generates these values from the complete hex values (4 bytes, not 3) is:

#include <stdio.h>
#include <conio.h>   /* for getch(); Windows-specific */

int main(int argc, char** argv) {
    /* raw 32-bit values: 3 longitudes followed by 3 latitudes */
    int a[] = { 0x4adac812, 0x4adaee12, 0x4ae86d11, 0x2b6059f9, 0x2a3c8df9, 0x2afd0efb };
    int i = 0;
    /* longitudes: divide by (2<<(3*8)) == 2^25, then apply the fitted slope and offset */
    while (i < 3) {
        double b = (double)a[i] / (2 << (3*8)) * 8.668993 - 250.0197;
        printf("%f\n", b);
        i++;
    }
    /* latitudes: same scaling, different slope and offset */
    while (i < 6) {
        double b = (double)a[i] / (2 << (3*8)) * 0.05586007 + 41.78172;
        printf("%f\n", b);
        i++;
    }
    printf("press key");
    getch();
    return 0;
}
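
As a quick check of these constants on the first raw value: 0x4adac812 = 1255852050, and (2<<(3*8)) is 2^25 = 33554432, so 1255852050 / 33554432 * 8.668993 - 250.0197 comes out to 74.437368, the first fitted longitude above.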

Comments

  • DarkSerg over 1 year

    I have a binary data stream that carries geolocation coordinates - latitude and longitude. I need to figure out how they are encoded.

    4adac812 = 74°26.2851' = 74.438085
    2b6059f9 = 43°0.2763'  = 43.004605
    
    4adaee12 = 74°26.3003' = 74.438338
    2a3c8df9 = 42°56.3177' = 42.938628
    
    4ae86d11 = 74°40.1463' = 74.669105
    2afd0efb = 42°59.6263' = 42.993772
    

    The 1st value is the hex value; the 2nd and 3rd are the values I get in the output (not sure which one is used in the conversion).
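
    (The 2nd and 3rd values appear to be the same coordinate in two formats: decimal degrees = degrees + minutes/60, e.g. 74°26.2851' = 74 + 26.2851/60 = 74.438085.)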

    I've found that the first byte represents the integer part of the value (0x4a = 74), but I cannot figure out how the decimal part is encoded.
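
    To make that concrete, here is a rough sketch (the field names are mine) that just splits each 32-bit value from the list above into the top "degrees" byte and the low 24 bits whose encoding I can't explain:

    #include <stdio.h>

    int main(void) {
        /* raw 32-bit values from the list above */
        unsigned int vals[] = { 0x4adac812, 0x2b6059f9, 0x4adaee12,
                                0x2a3c8df9, 0x4ae86d11, 0x2afd0efb };
        int i;
        for (i = 0; i < 6; i++) {
            unsigned int deg  = vals[i] >> 24;       /* top byte: integer degrees, e.g. 0x4a = 74 */
            unsigned int rest = vals[i] & 0xffffff;  /* low 24 bits: fractional part, encoding unknown */
            printf("%08x -> %u degrees, low 24 bits = 0x%06x\n", vals[i], deg, rest);
        }
        return 0;
    }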

    I would really appreciate any help!

    Thanks.

    --

    Upd: This stream comes from some "Chinese" GPS server software over TCP. I have no sources or documentation for the client software. I suppose it was written in VC++ 6 and uses some standard implementations.

    --

    Upd: Here are the packets I get:

    Hex data:
    41 00 00 00  13 bd b2 2c
    4a e8 6d 11  2a 3c 8d f9
    f6 0c ee 13
    
    Log data in client soft:
    [Lng] 74°40.1463', direction:1
    [Lat] 42°56.3177', direction:1
    [Head] direction:1006, speed:3318, AVA:1
    [Time] 2011-02-25 19:52:19
    
    Result data in client (UI):
    74.669105
    42.938628
    Head 100 // floor(1006/10)
    Speed 61.1 // floor(3318/54.3)
    
    
    41 00 00 00  b1 bc b2 2c
    4a da ee 12  2b 60 59 f9
    00 00 bc 11
    [Lng] 74°26.3003', direction:1
    [Lat] 43°0.2763', direction:1
    [Head] direction:444, speed:0, AVA:1
    [Time] 2011-02-25 19:50:49
    74.438338
    43.004605
    
    
    00 00 00 00  21 bd b2 2c
    4a da c8 12  aa fd 0e fb
    0d 0b e1 1d
    [Lng] 74°26.2851', direction:1
    [Lat] 42°59.6263', direction:1
    [Head] direction:3553, speed:2829, AVA:1
    [Time] 2011-02-25 19:52:33
    74.438085
    42.993772
    

    I don't know what the first 4 bytes mean.

    I found that the lower 7 bits of the 5th byte represent the seconds (maybe bytes 5-8 are the time?). Byte 9 represents the integer part of Lat.

    Byte 13 is the integer part of Lng.

    Bytes 17-18 reversed (a 16-bit word, low byte first) are the speed.

    Bytes 19-20 reversed are AVA(?) and direction (4 + 12 bits). (BTW, does anybody know what AVA is?)

    One more note: in the 3rd packet, only the lower 7 bits of the 13th byte are used. I guess the 1st bit doesn't mean anything (I removed it at the beginning, sorry if I'm wrong).
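
    Putting the offsets above together, here is a rough parsing sketch for the first example packet (0-based indexes in the code; the field names are mine, the Lng/Lat labels follow the client log for these packets, and the first 4 bytes and the exact time format are still guesses):

    #include <stdio.h>

    int main(void) {
        /* first example packet from above (20 bytes) */
        unsigned char pkt[20] = {
            0x41,0x00,0x00,0x00, 0x13,0xbd,0xb2,0x2c,
            0x4a,0xe8,0x6d,0x11, 0x2a,0x3c,0x8d,0xf9,
            0xf6,0x0c,0xee,0x13
        };

        /* bytes 1-4 (indexes 0-3): unknown */
        /* bytes 5-8 (4-7): maybe the time; the low 7 bits of byte 5 match the seconds (0x13 = 19) */
        unsigned int sec = pkt[4] & 0x7f;

        /* bytes 9-12 (8-11): first coordinate, labelled [Lng] in the client log; top byte = integer degrees */
        unsigned int lng_raw = ((unsigned int)pkt[8] << 24) | (pkt[9] << 16) | (pkt[10] << 8) | pkt[11];
        /* bytes 13-16 (12-15): second coordinate, labelled [Lat]; top bit of byte 13 masked off as noted above */
        unsigned int lat_raw = ((unsigned int)(pkt[12] & 0x7f) << 24) | (pkt[13] << 16) | (pkt[14] << 8) | pkt[15];

        /* bytes 17-18 (16-17) reversed: speed (0x0cf6 = 3318) */
        unsigned int speed = pkt[16] | (pkt[17] << 8);
        /* bytes 19-20 (18-19) reversed: AVA (high 4 bits) and direction (low 12 bits) -> 1 and 1006 */
        unsigned int w   = pkt[18] | (pkt[19] << 8);
        unsigned int ava = w >> 12;
        unsigned int dir = w & 0x0fff;

        printf("sec=%u lng=0x%08x (%u deg) lat=0x%08x (%u deg)\n",
               sec, lng_raw, lng_raw >> 24, lat_raw, lat_raw >> 24);
        printf("speed=%u direction=%u ava=%u\n", speed, dir, ava);
        return 0;
    }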