Setting up password-less ssh between servers
Requirement: Hadoop deployments need password-less ssh so that the master server can start the Hadoop daemons on the slave machines.
Setting up ssh without passwords:
- Make sure that an ssh client is installed on the server. If necessary, install openssh-client. You'll also need to install the ssh server:
sudo apt-get install openssh-server
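After installing, you can confirm that the ssh daemon is actually running (this assumes a Debian/Ubuntu system, consistent with the apt-get command above):
sudo service ssh status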
- Ensure that the default ssh port 22 is open on all machines. You can quickly test this by using telnet to connect to each machine like so:
telnet <server-hostname> 22
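If the port is open, telnet prints something like the following (the IP address shown is illustrative):
Trying 192.168.1.10...
Connected to slave-hostname.
Escape character is '^]'.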
This should connect successfully; if the connection is refused, port 22 is blocked or the ssh server isn't running on that machine.
- Next, we need to create the ssh key. Use:
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
-t Sets the key type (here, RSA)
-P Sets the passphrase (in this case, blank)
-f Specifies the filename
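To double-check that the key pair was generated, list the files and print the public key's fingerprint (the fingerprint output will differ per machine):
ls $HOME/.ssh/id_rsa $HOME/.ssh/id_rsa.pub
ssh-keygen -lf $HOME/.ssh/id_rsa.pub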
- Copy the generated public key to all the slaves (replace username with the user that starts the Hadoop daemons). You will be prompted for each slave's password:
ssh-copy-id -i $HOME/.ssh/id_rsa.pub username@slave-hostname
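With many slaves, the copy step can be looped; this is a sketch assuming a hypothetical slaves.txt file containing one hostname per line:
# Copy the public key to every slave listed in slaves.txt (hypothetical file)
for host in $(cat slaves.txt); do
  ssh-copy-id -i $HOME/.ssh/id_rsa.pub "username@$host"
done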
- If the server also acts as a slave (`ssh localhost` should work without a password), append the public key to the server's own authorized_keys as well:
cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
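Note that openssh ignores authorized_keys files with overly loose permissions (this is standard openssh behaviour, not Hadoop-specific), so tighten them and then test the login:
chmod 700 $HOME/.ssh
chmod 600 $HOME/.ssh/authorized_keys
ssh localhost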
If HDFS and MapReduce daemons are run as different users, then the key generation and copying steps above have to be repeated for each of those users.
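For example, assuming a separate hdfs user (the username is illustrative; substitute the accounts your deployment actually uses):
# Repeat key generation and distribution as the hdfs user (hypothetical account)
sudo -u hdfs mkdir -p /home/hdfs/.ssh
sudo -u hdfs ssh-keygen -t rsa -P "" -f /home/hdfs/.ssh/id_rsa
sudo -u hdfs ssh-copy-id -i /home/hdfs/.ssh/id_rsa.pub hdfs@slave-hostname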