Sep 08, 2023

Use Steampipe to select your AWS resources across SSO accounts with SQL

See how this setup helps you discover your AWS resources across all SSO accounts with a mix of Steampipe, Docker, bash scripts, and the AWS CLI

Use case

You are in charge of several AWS accounts within an AWS Organisation and need to check the resources across these accounts, for example, to see which runtimes are configured for your Lambda functions.

Approach with Steampipe

Steampipe is a tool to query data from different providers. Among others, there is a plugin for AWS.

The big plus is that Steampipe can query more than one account with a single query via an aggregator connection.
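For illustration, an aggregator connection in the Steampipe AWS plugin config (typically ~/.steampipe/config/aws.spc) looks roughly like this; the connection names here are placeholder assumptions, not the ones the scripts below generate:

```hcl
# Aggregator connection: a query against "aws_all" fans out over every
# connection whose name matches "aws_*".
connection "aws_all" {
  plugin      = "aws"
  type        = "aggregator"
  connections = ["aws_*"]
}

# One per-account connection, tied to an AWS CLI profile (name is illustrative).
connection "aws_dev_admin" {
  plugin  = "aws"
  profile = "dev-AdministratorAccess"
}
```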

This is what the result looks like for my AWS SSO accounts.

(Screenshot: query result)

More about Steampipe and AWS:


The link between the AWS CLI profiles and the Steampipe connections for the AWS SSO accounts is created inside a Docker image, so it can be recreated at any time without affecting the local setup. This Docker image is based on the Steampipe image, with some additional tools, the AWS CLI, and the Steampipe AWS plugin installed on top.



# Build on the official Steampipe image (assumption: turbot/steampipe).
FROM turbot/steampipe

# Setup prerequisites (as root)
USER root:0
RUN apt-get update -y \
 && apt-get install -y git curl unzip jq

# Install the AWS CLI v2 via the official installer.
RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" \
 && unzip awscliv2.zip \
 && ./aws/install \
 && rm -rf ./aws awscliv2.zip

# Install the aws and steampipe plugins for Steampipe (as steampipe user).
USER steampipe:0
RUN steampipe plugin install steampipe aws

The Steampipe documentation is here:

After building the image with docker build -t steampipe-query ., the container can be created with the following command.

docker run --entrypoint /bin/bash -it \
--mount type=bind,source="${PWD}/queries",target=/workspace/queries \
--mount type=bind,source="${PWD}/scripts",target=/workspace/scripts \
--mount type=bind,source="${PWD}/.env",target=/workspace/.env \
--name steampipe-query \
steampipe-query

To use the container again later, run docker start -a steampipe-query; to open another shell in the running container, run docker exec -it steampipe-query /bin/bash.


One of the mount points was the folder queries, which contains, in this example, the SQL to check the Lambda runtimes.

select
  _ctx ->> 'connection_name' as connection_name,
  runtime,
  count(*) as lambda_count,
  sum(count(*)) over () as total_count
from
  aws_lambda_function
where
  runtime not in ('nodejs18.x', 'nodejs16.x', 'python3.9')
group by
  connection_name,
  runtime
order by
  connection_name,
  runtime;

The command to run this query is steampipe query queries/lambda-runtime.sql. This will work after the scripts have created the profiles and connections config.


The other mount points are the scripts folder and the .env file. The first step is to set the needed env variable values and then run the script ./scripts/ inside the container, which creates the file ~/.aws/config with the SSO session values.

SSO_START_URL= # e.g. https://<your-subdomain>.awsapps.com/start
SSO_SESSION_NAME= # <your session name, it's just a name>
SSO_REGION= # <your region, e.g. us-east-1>
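The script itself is not shown above; a minimal sketch of what it could do, assuming the three SSO_* values have been set (normally via the sourced .env file; the fallbacks below are illustrative placeholders only):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: write an sso-session block into the AWS config file.
# The SSO_* values normally come from .env; the defaults here are placeholders.
set -euo pipefail

SSO_SESSION_NAME="${SSO_SESSION_NAME:-my-session}"
SSO_START_URL="${SSO_START_URL:-https://example.awsapps.com/start}"
SSO_REGION="${SSO_REGION:-us-east-1}"
CONFIG_FILE="${AWS_CONFIG_FILE:-$HOME/.aws/config}"

mkdir -p "$(dirname "$CONFIG_FILE")"

cat >> "$CONFIG_FILE" <<EOF
[sso-session ${SSO_SESSION_NAME}]
sso_start_url = ${SSO_START_URL}
sso_region = ${SSO_REGION}
sso_registration_scopes = sso:account:access
EOF

echo "Wrote sso-session '${SSO_SESSION_NAME}' to ${CONFIG_FILE}"
```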

As the next step, source the env file with source .env to get the value for the session name. Then log in to AWS SSO with the command aws sso login --sso-session $SSO_SESSION_NAME.

It will look like this.

(Screenshot: sso login)

Open the link in the browser and put in the code.

(Screenshot: authorize request)

Then, allow the access.

(Screenshot: allow SSO to access data)

(Screenshot: successfully logged in)

After access is confirmed, you can create profiles with the script ./scripts/ inside the container. This will create a profile for each account in the AWS config file ~/.aws/config (after confirmation), with the assigned role for the account as a suffix.
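That script is also not reproduced here; a minimal sketch of its core idea, with an illustrative helper render_profile. In the real script the account/role pairs come from the aws sso list-accounts and aws sso list-account-roles CLI calls; here they are passed in as plain arguments:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: render one [profile ...] block per account/role pair.
# In a real script the pairs would come from `aws sso list-accounts` and
# `aws sso list-account-roles`; here they are hard-coded for illustration.
set -euo pipefail

SSO_SESSION_NAME="${SSO_SESSION_NAME:-my-session}"  # normally from .env
SSO_REGION="${SSO_REGION:-us-east-1}"               # normally from .env

# render_profile <account_id> <role_name> prints a profile block whose name
# is the account id suffixed with the role, as described above.
render_profile() {
  local account_id="$1" role="$2"
  printf '\n[profile %s-%s]\nsso_session = %s\nsso_account_id = %s\nsso_role_name = %s\nregion = %s\n' \
    "$account_id" "$role" "$SSO_SESSION_NAME" "$account_id" "$role" "$SSO_REGION"
}

# Example: print two profiles (redirect to ~/.aws/config in practice).
render_profile 111111111111 AdministratorAccess
render_profile 222222222222 ReadOnlyAccess
```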

The script is adapted from this gist:

The last step of the setup is to create the connections for Steampipe with the script ./scripts/ inside the container. This will create a connection for each profile in the AWS config file ~/.aws/config. Not every role is allowed to query the data, so it's necessary to set the env variable ALLOWED_ROLES to a comma-separated list of the roles that are allowed to query the data. E.g.
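As with the profile script, the connection generator is not shown; a sketch of the idea, assuming the profile naming from above (account id plus role suffix) and the ALLOWED_ROLES filter. In a real script the output would go to ~/.steampipe/config/aws.spc:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: emit a Steampipe aggregator plus one aws connection per
# profile whose role suffix appears in the comma-separated ALLOWED_ROLES list.
set -euo pipefail

ALLOWED_ROLES="${ALLOWED_ROLES:-AdministratorAccess,ReadOnlyAccess}"  # illustrative default

render_connections() {
  local config_file="$1"

  # Aggregator connection so one query spans all per-account connections.
  printf 'connection "aws_all" {\n  plugin = "aws"\n  type = "aggregator"\n  connections = ["aws_*"]\n}\n'

  # One connection per allowed profile; hyphens are replaced with underscores
  # so the connection name stays a valid identifier.
  sed -n 's/^\[profile \(.*\)\]$/\1/p' "$config_file" | while read -r profile; do
    role="${profile##*-}"
    case ",${ALLOWED_ROLES}," in
      *",${role},"*)
        printf '\nconnection "aws_%s" {\n  plugin = "aws"\n  profile = "%s"\n}\n' \
          "${profile//-/_}" "$profile"
        ;;
    esac
  done
}

# Example run against a throwaway config file.
cfg="$(mktemp)"
printf '[profile 111111111111-AdministratorAccess]\n[profile 222222222222-DenyRole]\n' > "$cfg"
render_connections "$cfg"
```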


And now it’s possible to run the queries with steampipe 🥳