Setting up password-less ssh between servers

Requirement: Hadoop deployments need to be set up so that the master server can log into the slave machines without a password in order to start the Hadoop daemons on them.

Setting up ssh without passwords:
  1. Make sure that an ssh client is installed on the server. If necessary, install openssh-client. You'll also need to install the ssh server:
    sudo apt-get install openssh-client openssh-server
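
    If you're unsure whether the packages are already present, you can check first (a quick sketch, assuming a Debian/Ubuntu system since apt-get is used above):

    # List the installed openssh packages, if any
    dpkg -l openssh-client openssh-server
    # Confirm the ssh daemon is running
    service ssh status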

  2. Ensure that the default ssh port 22 is open on all machines. You can quickly test this with telnet, which should connect successfully:
    telnet <server-hostname> 22
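
    If telnet isn't installed, netcat can run the same check (a sketch; -z scans without sending data, -v makes the result verbose):

    # Should report success for each reachable machine
    nc -zv <server-hostname> 22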

  3. Next we need to create the ssh key. Use:
    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
    -t Sets the key type (here RSA)
    -P Sets the passphrase (in this case blank)
    -f Specifies the filename
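
    After this runs, the key pair should exist under ~/.ssh; a quick way to confirm:

    # id_rsa is the private key, id_rsa.pub the public key
    ls -l ~/.ssh/id_rsa ~/.ssh/id_rsa.pub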

  4. Copy the generated public key to all the slaves, using the command shown in step 5 (replace username with the user that starts the Hadoop daemons). You will be prompted for each slave's password.

  5. ssh-copy-id -i $HOME/.ssh/id_rsa.pub username@slave-hostname
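
    With several slaves this is easier in a loop (a sketch; it assumes a file named slaves listing one hostname per line, as in a typical Hadoop conf/slaves file, and username is a placeholder):

    # Copy the public key to every slave listed in the file
    for host in $(cat slaves); do
      ssh-copy-id -i $HOME/.ssh/id_rsa.pub username@$host
    done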

  6. If the server also acts as a slave ('ssh localhost' should work without a password), append the public key to the server's own authorized_keys:
    cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
    If HDFS/MapReduce are run as different users, then steps 3, 4 and 5 have to be repeated for each of those users.
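
    Finally, verify that login really is password-less from the server (slave-hostname is a placeholder for each of your slaves):

    # Each command should print the remote hostname without prompting for a password
    ssh localhost hostname
    ssh username@slave-hostname hostname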