I’m getting the error below when I run this:
ansible -m ping all
host1 | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey,password).\r\n",
"unreachable": true
}
I have gone through all the steps in this tutorial and now I’m stuck at step 3. https://www.digitalocean.com/community/tutorials/how-to-install-and-configure-ansible-on-ubuntu-16-04
Any help is appreciated.
ssh <user>@<target>
If this works, SSH is enabled; move on to the next step.
ansible-playbook -i inventory.yaml --private-key=/home/<user_id>/.ssh/id_rsa <playbook>.yaml -u <user_id>
If this fails with the following error, the steps listed after the error should fix it:
TASK [Gathering Facts] *********************************************************
fatal: [x-host.com]: UNREACHABLE! => {"changed": false, "msg": "Authentication failed.", "unreachable": true}
to retry, use: --limit
PLAY RECAP *********************************************************************
x-host.com : ok=0 changed=0 unreachable=1 failed=0
On the remote host, move/rename the ~/.ssh/authorized_keys file.
Then, from the Ansible master, run the following:
ssh-copy-id -i ~/.ssh/id_rsa.pub <user-id>@<x-host>
# ssh-copy-id — use locally available keys to authorise logins on a remote machine
This creates an authorized_keys file in the remote server’s ~/.ssh folder for the <user-id> user.
Now run the Ansible playbook, and it should work.
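For reference, a minimal inventory.yaml for the playbook command above might look like this (x-host.com and user-id are just the placeholders from this answer; adjust them to your environment):
all:
  hosts:
    x-host.com:
      ansible_user: user-id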
This error usually occurs when there is no valid public key set up for the target server.
First, check whether you have a valid public/private key pair generated for the user you are using. Keep the public key on the target server and the private key on the control server. Then get the path to the private key and configure it in the ansible.cfg file (check the Ansible docs for this).
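A minimal sketch of that setup, assuming a user named deploy and a target called target-host (both placeholders):
ssh-keygen -t rsa -b 4096        # generate the key pair on the control server
ssh-copy-id deploy@target-host   # put the public key on the target server
Then, in ansible.cfg on the control server:
[defaults]
private_key_file = ~/.ssh/id_rsa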
This can happen even if you have set up passwordless SSH between System A and System B (either with the ssh-copy-id command or by manually copying the public key, i.e. the content of the id_rsa.pub file on System A, to the .ssh/authorized_keys file on System B). If it does, one possible reason is the user home directories: if the user’s home directory on System A is, say, /home/tester and on System B it is /users/tester, passwordless SSH might not work. Making sure both users have the same home directory resolves this. I observed this case on CentOS machines, and after making the home directories for the users the same, the issue was resolved.
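One quick way to compare the two is getent (tester is just the example user from above); run this on both systems and compare the home directory field:
getent passwd tester    # e.g. tester:x:1001:1001::/home/tester:/bin/bash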
Since you can connect directly over SSH, Ansible is defaulting to a different key. Create/edit the ansible.cfg file in your playbook directory and add a line for the location of your key:
[defaults]
private_key_file = /Users/username/.ssh/private_key
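If you just want to test before editing the config, you can also pass the key for a single run with the same path:
ansible all -m ping --private-key=/Users/username/.ssh/private_key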
Did you work through the prerequisite tutorial listed at the beginning of the Ansible installation article? Specifically step 4.
This one: Initial Server Setup with Ubuntu 16.04
It looks like your user’s public SSH key, which Ansible uses to authenticate, isn’t present on the remote Droplet? I’m not sure, however.
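One way to check is to log into the Droplet another way (password or root) and look for your key; sammy here stands for whatever user you created in that prerequisite tutorial:
cat /home/sammy/.ssh/authorized_keys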
Hi there! What happens if you try to connect to your server via SSH?
ssh root@your_server_ip
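If that fails too, running SSH in verbose mode usually shows which keys are offered and why they are rejected:
ssh -v root@your_server_ip
You can get similar detail from Ansible itself by adding -vvv, e.g. ansible -m ping all -vvv.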
Hi,
I’m trying to automate adding new servers via Ansible, so I would like to use the root private key instead of the root password.
Right now I use this command:
ansible all -m user -a "name=myapp state=present password={{ user_password }}" -u root -k
It prompts me for the root password. I would like to use the following one instead:
ansible all -m user -a "name=myapp state=present password={{ user_password }}" -u root --private-key=/root/.ssh/id_rsa
But I can’t use this one because my ansible user does not have the privilege to access the root private key.
Does anyone have a solution, please?
NB: root pub key is already copied to the node server.