Uninitialized value was created by a heap allocation


Solution 1

In the updated code you use an uninitialized pointer:

dictionary = malloc(sizeof(node*) * LISTS);

// .... code that does not change dictionary[i] for any i

new_node->next = dictionary[index];   // use uninitialized pointer

As others have already noted, this will only work if you set all the pointers to NULL before entering that loop:

dictionary = malloc(sizeof(node*) * LISTS);
if ( !dictionary ) {
    return false;
}

for (size_t i = 0; i < LISTS; ++i) {
    dictionary[i] = NULL;
}

Solution 2

The heap allocation you assign to dictionary uses malloc, which does not initialize the returned bytes. So dictionary in the code you've posted ends up being an array of uninitialized pointers. Presumably you go on to read those pointers, which valgrind correctly flags as an error.

An easy way to fix this is to use calloc instead of malloc, because it zeros the returned bytes for you. Or, use memset to zero the bytes yourself.

Author: Core_dumped
Updated on January 12, 2022

Comments

  • Core_dumped, over 2 years ago

    I'm trying to implement a dictionary of words using a hash table, so I need it to be global. In one of my header files I declare it:

    extern node** dictionary;
    

    Where node is

    typedef struct node
    {
        char* word;
        struct node* next;
    } node;
    

    Then, in another file where the functions are defined, I include the header containing the dictionary declaration, and at the top I add:

    node** dictionary;
    

    Then, in the function that actually loads the dictionary, I first allocate memory for the linked lists that make up the hash table:

    bool load(const char* dict_file)
    {
        dictionary = malloc(sizeof(node*) * LISTS);
    
        FILE* dict = fopen(dict_file, "r");
    
        if(dict == NULL)
            return false;
    
        char buffer[MAX_LEN + 2];
    
        size_dict = 0;
    
        while(fgets(buffer, MAX_LEN + 2, dict) != NULL)
        {
            node* new_node = malloc(sizeof(node));
    
            int len = strlen(buffer);
    
            new_node->word = malloc(sizeof(char) * (len));
    
            //avoid \n
            for(int i = 0; i < len - 1; i++)
                new_node->word[i] = buffer[i];
    
            new_node->word[len - 1] = '\0';
    
            new_node->next = NULL;
    
            int index = hash(buffer);
    
            new_node->next = dictionary[index];
    
            dictionary[index] = new_node;
    
            size_dict++;
        }
    
        if (ferror(dict))
        {
            fclose(dict);
            return false;
        }
    
        fclose(dict);
        return true;
    }
    

    The program works fine, and I then free all the memory allocated for the strings and nodes. When I run valgrind (a tool that detects memory errors and leaks), it reports that no leaks are possible, but it shows the error Uninitialised value was created by a heap allocation and points to the exact line where I allocate memory for dictionary, the very first line of the load function shown above.
    What am I doing wrong? I suspect the way I use dictionary globally is wrong, so can anybody suggest another way of keeping it global while avoiding this error?