Start SSH agent inside container
To be able to SSH to the GET-created hosts from within the container, I would like my private SSH key loaded into the SSH agent inside the container.
It should be possible to do this either by forwarding the host's agent into the container, or by starting `ssh-agent` in the container and loading the private key mounted from the host.
In `devcontainer.json`, mount the host's `SSH_AUTH_SOCK` (note that devcontainer.json is JSONC, so comments use `//`):

```json
// ...
"mounts": [
    // other mounts...
    // ...
    "type=bind,source=${localEnv:SSH_AUTH_SOCK},target=/tmp/1password-agent.sock"
],
// ...
```
and then use it within the container's environment:

```json
// ...
"containerEnv": {
    "SSH_AUTH_SOCK": "/tmp/1password-agent.sock"
},
"postCreateCommand": ".devcontainer/postCreateCommand.sh",
// ...
```
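With the mount and environment in place, a quick sanity check from a shell inside the container (a minimal sketch; the socket path matches the mount above) confirms that forwarding works:

```bash
# The bind-mounted socket should exist inside the container
ls -l /tmp/1password-agent.sock
# ...and the forwarded agent should list the host's keys
ssh-add -l
```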
`postCreateCommand.sh` can then gain a step for setting up the SSH agent:

```bash
#!/usr/bin/env bash
ansible-playbook .devcontainer/get/setup.yml
.devcontainer/get/gcloud-auth.sh
.devcontainer/get/ssh/agent.sh  # ← use forwarded agent or fall back
```
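If `agent.sh` is a newly added file, it also needs to be executable for the step above to run (a one-time step, assuming the path shown here):

```bash
chmod +x .devcontainer/get/ssh/agent.sh
```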
And a new `agent.sh` would be something like this:

```bash
#!/bin/bash
set -e

echo "Setting up SSH agent..."

# Check if agent forwarding is working
if ssh-add -l > /dev/null 2>&1; then
    echo "SSH agent forwarding is active"
    exit 0
fi

# Agent not running.
# Check if SSH_AUTH_SOCK exists but the agent isn't responding
if [ -n "$SSH_AUTH_SOCK" ] && [ -S "$SSH_AUTH_SOCK" ]; then
    echo "SSH_AUTH_SOCK exists but agent not responding"
fi

# Start a local SSH agent since forwarding failed
eval "$(ssh-agent -s)"
echo "export SSH_AUTH_SOCK=$SSH_AUTH_SOCK" > ~/.ssh-agent-env
echo "export SSH_AGENT_PID=$SSH_AGENT_PID" >> ~/.ssh-agent-env

# Try to add the default key if it exists
KEYS="${GET_KEYS:-$WORKSPACE/keys}/gitlab.ssh"
if [ -f "${KEYS}" ]; then
    ssh-add "${KEYS}"
    echo "Local SSH agent started"
else
    echo "Failed to start SSH agent: no keys found at ${KEYS}" >&2
fi
```
Finally, `~/.bashrc` can source the SSH agent environment, so new shells pick up the fallback agent:

```bash
if ! ssh-add -l > /dev/null 2>&1 && [ -f ~/.ssh-agent-env ]; then
    source ~/.ssh-agent-env
fi
```
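To verify the whole setup end to end from a fresh shell inside the container (a sketch; the hostname below is a placeholder, not an actual GET host):

```bash
# The agent (forwarded or local fallback) should list at least one key
ssh-add -l
# SSH to a GET-created host should then authenticate without prompting
# (replace with a real GET-managed hostname)
ssh user@some-get-host.example.com hostname
```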