What is the difference between init 6 and reboot on Red Hat / RHEL / CentOS?

There is no difference between them. Internally they do exactly the same thing:

 1. reboot invokes the shutdown command (with the -r switch). The shutdown command
    kills all the running processes, unmounts all the file systems and finally tells
    the kernel to issue the ACPI power command.

 2. init 6 tells the init process to shut down all of the spawned processes/daemons as
    written in the init files (in the reverse of the order in which they started) and
    lastly invokes the shutdown -r now command to reboot the machine.
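The equivalence can be seen from a shell. This is a sketch only: paths and behaviour vary by release, and on systemd-based releases (RHEL 7 and later) all three commands are compatibility symlinks to systemctl, so do not run the reboot lines on a machine you care about.

```shell
# All three of these trigger the same orderly reboot path:
shutdown -r now   # kill processes, unmount file systems, reboot
init 6            # switch to runlevel 6, running the K*/S* scripts in order
reboot            # wrapper that ends up calling shutdown -r

# On a systemd-based release you can confirm they are the same binary:
ls -l /sbin/reboot /sbin/shutdown   # typically symlinks to ../bin/systemctl
```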
ujjain

Updated on September 18, 2022

Comments

  • ujjain
    ujjain over 1 year

    I tried programming a neural network in the Processing IDE. It worked quite well until I tried the MNIST handwritten-digits data set. The iris data set and a few others from the UCI machine learning repository worked, but MNIST didn't: for some reason all of the outputs approached zero over time, which made the total error always equal to 1. I am almost sure the problem is the activation function, so I tried using softmax for classification, but it wasn't very successful; I got the same results. I then thought I should use a different loss function, so I tried the negative log probability according to this video. The results now are the same cost value for each output neuron, and the sum of the outputs is not 1 as it should be. Here are the functions for each part of the code that I have changed (I prefer not to share the full code because it's long and messy, and not really helpful):

    softmax:

    float[] softmax(float[] inputVector){
      float[] result = new float[inputVector.length];
      float sigma = 0;
      for(int i = 0; i < inputVector.length; i++){
        sigma += exp(inputVector[i]);
      }
      for(int i = 0; i < result.length; i++){
        result[i] = exp(inputVector[i]) / sigma;
      }
      return result;
    }
    

    derivative of softmax:

    float[] derivativeSoftmax(float[] inputVector){
      float[] result = new float[inputVector.length];
      for(int i = 0; i < result.length; i++){
        result[i] = softmax(inputVector)[i] * (1 - softmax(inputVector)[i]);
      }
      return result;
    }
    

    loss function:

    for(int i = 0; i < outputNeuronsNumber; i++){
      float tempSigma = 0;
      for(int j = 0; j < outputNeuronsNumber; j++){
        tempSigma += target[diffCounter2] * log(outputLayer[j]);
      }
      cost[i] = -tempSigma;
    }
    

    I can't see what the problem with my code is.

  • Admin
    Admin almost 6 years
    According to the link you sent, maybe it would be easier to rewrite the normal softmax function, so that the derivative is correct as well? And why are all the cost values the same?
  • Ryan
    Ryan almost 6 years
    I'm not entirely sure what you mean by rewriting the original softmax function, it is correct. Sometimes cost values are all the same because somewhere you are doing scalar addition/multiplication where you should be doing vector addition/multiplication. This is just one of many possibilities, but "having all your outputs be the same" is a pretty good indicator of this bug.
  • harshavmb
    harshavmb about 2 years
    Wish you had mentioned the source of the above copied content, from here.
  • krad
    krad about 2 years
    That is not the same. init 6 will run all the relevant shutdown scripts for the services; reboot may or may not do this, depending on the OS and distribution (it is generally safe on modern Linux machines). Killing processes instead of running their shutdown scripts may leave certain apps in an inconsistent state, which can prevent them from restarting on the next boot, especially if a kill -9 is used. Just a warning from a seasoned Unix bod.
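Returning to the neural-network question in the comments above: one likely culprit in the posted loss loop is that the inner sum indexes target[diffCounter2] (a variable not shown in the snippet) rather than target[j], and the sum does not depend on i at all, so every cost[i] comes out identical, which is exactly the symptom described. A minimal Java sketch of a numerically stable softmax, together with the standard combined softmax-plus-cross-entropy gradient (output minus target), might look like this; the class and method names are illustrative, not taken from the original code:

```java
import java.util.Arrays;

public class SoftmaxDemo {
    // Numerically stable softmax: subtract the max before exponentiating
    // so that exp() never overflows for large inputs.
    static float[] softmax(float[] x) {
        float max = Float.NEGATIVE_INFINITY;
        for (float v : x) max = Math.max(max, v);
        float[] out = new float[x.length];
        float sum = 0f;
        for (int i = 0; i < x.length; i++) {
            out[i] = (float) Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    // Gradient of the negative-log-likelihood loss with respect to the
    // pre-softmax inputs (logits). For softmax combined with cross-entropy
    // the full Jacobian collapses to the simple form (p - target),
    // so the elementwise p*(1-p) derivative is not needed.
    static float[] softmaxCrossEntropyGrad(float[] logits, float[] target) {
        float[] p = softmax(logits);
        float[] grad = new float[p.length];
        for (int i = 0; i < p.length; i++) grad[i] = p[i] - target[i];
        return grad;
    }

    public static void main(String[] args) {
        float[] logits = {2.0f, 1.0f, 0.1f};
        float[] p = softmax(logits);          // probabilities sum to 1
        float[] g = softmaxCrossEntropyGrad(logits, new float[]{1f, 0f, 0f});
        System.out.println("p = " + Arrays.toString(p));
        System.out.println("grad = " + Arrays.toString(g));
    }
}
```

Note that the posted derivativeSoftmax also recomputes softmax(inputVector) twice per element inside the loop; computing it once before the loop would be much cheaper even if the elementwise form were kept.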