Salesforce integration with AWS AppFlow, S3, Lambda and SQS

Salesforce and Amazon Web Services (AWS) are two powerful software development and cloud computing platforms. In this post, we’ll discuss how these two tools can be integrated for an optimized and efficient workflow.

The integration of Salesforce and AWS allows businesses to take advantage of the scalability, reliability, and security of both platforms. It enables them to move key data and applications between the two clouds quickly and efficiently, and it reduces integration complexity.

There are many ways to sync our Salesforce data with third parties in real time. One option is a mix of Salesforce and AWS services, specifically Change Data Capture from Salesforce and AppFlow from AWS. We are going to build a CloudFormation YAML file with everything we need to deploy our integration to any AWS environment. However, it can be a good idea to set things up first by point and click through the AWS console and then translate them into a CloudFormation template.

About Salesforce Change Data Capture

Receive near-real-time changes of Salesforce records, and synchronize corresponding records in an external data store.

Change Data Capture publishes change events, which represent changes to Salesforce records. Changes include creation of a new record, updates to an existing record, deletion of a record, and undeletion of a record.

Important:
Change Data Capture does not support relationships at the time this post was written (08/2021). This means you will not be able to sync anything beyond your object unless you implement some tricks using Process Builder and Apex. That’s out of the scope of this post and we are going to cover it in a separate one, because it requires some extra steps and knowledge.

To start listening on a specific object, go to Setup -> Integrations -> Change Data Capture and move the objects you want to the right. A captured change lands as a record like the sample below.
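
For reference, a change event record for an Account update looks roughly like this (the IDs and values here are illustrative; the full list of ChangeEventHeader fields is in the Salesforce Change Data Capture documentation):

{
  "ChangeEventHeader": {
    "entityName": "Account",
    "recordIds": ["0015f00000XXXXXAAA"],
    "changeType": "UPDATE",
    "transactionKey": "0002c4f4-0000-0000-0000-000000000000",
    "sequenceNumber": 1,
    "commitTimestamp": 1628683444000,
    "commitUser": "0055f000001XXXXAAA",
    "commitNumber": 11657372645
  },
  "Name": "Acme Corp"
}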

Advantages of using AppFlow approach

  • Data is transferred securely
  • Credentials are managed by an OAuth process
  • No coding required, unless you want to run some specific logic on every sync
  • 100% serverless, pay as you go

Disadvantages of using AppFlow approach

  • The connection must exist before deploying the infrastructure. This is a manual step
  • This approach can take some time to learn and configure, especially if you are already familiar with callouts from Salesforce

Requirements for Salesforce

  • Your Salesforce account must be enabled for API access. API access is enabled by default for the Enterprise, Unlimited, Developer, and Performance editions.
  • Your Salesforce account must allow you to install connected apps. If this functionality is disabled, contact your Salesforce administrator. After you create a Salesforce connection in Amazon AppFlow, verify that the connected app named Amazon AppFlow Embedded Login App is installed in your Salesforce account.
  • The refresh token policy for the Amazon AppFlow Embedded Login App must be set to Refresh token is valid until revoked. Otherwise, your flows will fail when your refresh token expires.
  • You must enable change data capture in Salesforce to use event-driven flow triggers.
  • If your Salesforce app enforces IP address restrictions, you must grant access to the addresses used by Amazon AppFlow.
  • To create private connections using AWS PrivateLink, you must enable both Manage Metadata and Manage External Connections user permissions in your Salesforce account. Private connections are currently available in the us-east-1 and us-west-2 AWS Regions.

Architecture for the solution

Let’s say we want to listen for changes on the Account object. Every time an Account is created or updated, an event will flow to AppFlow through Salesforce Change Data Capture.

We could add some logic in the Lambda function to decide whether we are interested in that change or not, as in the sketch below.
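
Here is a minimal sketch of what that Lambda could look like, not a definitive implementation. It assumes AppFlow writes one JSON record per line to the bucket; the file path matches the Handler value src/handlers/my.handler from the template below, and the MessageGroupId value is an arbitrary choice:

// src/handlers/my.js — sketch of the sync-up Lambda
const AWS = require("aws-sdk"); // aws-sdk v2 is bundled with the nodejs12.x runtime

const s3 = new AWS.S3();
const sqs = new AWS.SQS();

exports.handler = async (event) => {
  // Invoked by the s3:ObjectCreated notification configured on MyBucket
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // Read the file AppFlow just wrote; with FileType JSON we assume
    // one JSON record per line
    const file = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    const lines = file.Body.toString("utf-8").split("\n").filter(Boolean);

    for (const line of lines) {
      const changeEvent = JSON.parse(line);

      // Decide whether we care about this change,
      // e.g. ignore everything that is not a create or an update
      const header = changeEvent.ChangeEventHeader || {};
      if (header.changeType !== "CREATE" && header.changeType !== "UPDATE") {
        continue;
      }

      await sqs
        .sendMessage({
          QueueUrl: process.env.QueueURL, // injected by the template below
          MessageGroupId: "salesforce-account-changes", // required for FIFO queues
          MessageBody: line, // dedup is content-based, so no MessageDeduplicationId needed
        })
        .promise();
    }
  }
};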

How to create the Salesforce OAuth Connection

As we said, an OAuth connection must exist before deploying our stack to AWS. This is something we have to create by hand. If we deal with different environments in AWS, we can create as many connections as we want, pointing to our different Salesforce instances.

  • Open your AWS console and go to Amazon AppFlow
  • Go to View flows and click on Connections
  • Click on Create connection. Select Production, even if you have a developer org (developer orgs log in through the production endpoint). Provide a connection name
  • Once you click on Continue, a Salesforce popup will open. Enter your Salesforce credentials to log in
  • After that, your connection will be created and available to use; you can double-check it from the CLI, as shown below
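
If you want to confirm the connection exists (and copy its exact name for the ConnectorProfileName property later), the AWS CLI can list the Salesforce connector profiles:

aws appflow describe-connector-profiles --connector-type Salesforce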

The CloudFormation template

# Commands to deploy this through SAM CLI
#  sam build
#  sam deploy --no-confirm-changeset

AWSTemplateFormatVersion: 2010-09-09
Description: >-
  app flow lambda + s3 + SQS

Transform:
  - AWS::Serverless-2016-10-31

Parameters:
  Environment:
    Type: String
    Description: Environment name, e.g. dev, staging, testing

Globals:
  Function:
    Runtime: nodejs12.x
    Timeout: 30
    MemorySize: 128


Resources:
  MyLambda:
    Type: AWS::Serverless::Function
    DependsOn:
      - "MyQueue"
    Properties:
      Handler: src/handlers/my.handler
      Description: Sync up lambda
      Environment:
        Variables:
          QueueURL:
            Ref: "MyQueue"
          MyBucket: !Sub "${AWS::AccountId}-${Environment}-my-bucket"
      Role:
        Fn::GetAtt:
          - "MyLambdaRole"
          - "Arn"
      Tags:
        Name: !Sub "${Environment}-my-lambda"
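
  # Allows S3 to invoke the Lambda; without this permission the bucket
  # notification configuration on MyBucket cannot be created
  MyLambdaInvokePermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !Ref MyLambda
      Action: lambda:InvokeFunction
      Principal: s3.amazonaws.com
      SourceAccount: !Ref AWS::AccountId
      SourceArn: !Sub "arn:aws:s3:::${AWS::AccountId}-${Environment}-my-bucket"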

  MyLambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Statement:
          - Effect: Allow
            Action: "sts:AssumeRole"
            Principal:
              Service:
                - "lambda.amazonaws.com"
        Version: "2012-10-17"
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: AccessOnMyQueue
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Action: "sqs:SendMessage"
                Resource:
                  - Fn::GetAtt:
                      - "MyQueue"
                      - "Arn"
        - PolicyName: AccessToS3Notifications
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Action:
                  - 's3:GetBucketNotification'
                Resource: !Sub 'arn:aws:s3:::${AWS::AccountId}-${Environment}-my-bucket'
        - PolicyName: AccessOnS3Objects
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Action:
                  - "s3:GetObject"
                Resource: !Sub 'arn:aws:s3:::${AWS::AccountId}-${Environment}-my-bucket/*'


  MyBucket:
    Type: AWS::S3::Bucket
    DependsOn:
      - MyLambda
      - MyLambdaInvokePermission
    Properties:
      BucketName: !Sub "${AWS::AccountId}-${Environment}-my-bucket"
      NotificationConfiguration:
        LambdaConfigurations:
          - Event: 's3:ObjectCreated:*'
            Function: !GetAtt MyLambda.Arn
      LifecycleConfiguration:
        Rules:
          - Id: ExpirationInDays
            Status: 'Enabled'
            ExpirationInDays: 3
          - Id: NoncurrentVersionExpirationInDays
            Status: 'Enabled'
            NoncurrentVersionExpirationInDays: 3

  MyBucketPolicy:
    Type: AWS::S3::BucketPolicy
    DependsOn: MyBucket
    Properties:
      Bucket: !Ref MyBucket
      PolicyDocument:
        Version: '2008-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: appflow.amazonaws.com
            Action:
              - s3:PutObject
              - s3:AbortMultipartUpload
              - s3:ListMultipartUploadParts
              - s3:ListBucketMultipartUploads
              - s3:GetBucketAcl
              - s3:PutObjectAcl
            Resource:
              - !Sub "arn:aws:s3:::${AWS::AccountId}-${Environment}-my-bucket"
              - !Sub "arn:aws:s3:::${AWS::AccountId}-${Environment}-my-bucket/*"

  MyQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: !Sub "${Environment}-my-queue.fifo"
      FifoQueue: true
      ContentBasedDeduplication: true
      RedrivePolicy:
        deadLetterTargetArn:
          Fn::GetAtt:
            - "MyDeadLetterQueue"
            - "Arn"
        maxReceiveCount: 2

  MyDeadLetterQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: !Sub "${Environment}-my-queue-dlq.fifo"
      FifoQueue: true
      MessageRetentionPeriod: 1209600 # 14 days (the max supported)

  MyQueuePolicy:
    DependsOn:
      - "MyQueue"
    Type: AWS::SQS::QueuePolicy
    Properties:
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - "events.amazonaws.com"
                - "sqs.amazonaws.com"
            Action:
              - "sqs:SendMessage"
              - "sqs:GetQueueUrl"
              - "sqs:DeleteMessage"
              - "sqs:ReceiveMessage"
            Resource:
              Fn::GetAtt:
                - "MyQueue"
                - "Arn"
      Queues:
        - Ref: "MyQueue"

  # AppFlow flow to connect SFDC and AWS
  MyAppFlow:
    Type: AWS::AppFlow::Flow
    Properties:
      FlowName: !Sub "${Environment}-my-app-flow"
      Description: Flow to sync up with Salesforce
      TriggerConfig:
        TriggerType: Event
      SourceFlowConfig:
        ConnectorType: Salesforce
        ConnectorProfileName: !Sub "${Environment}-my-connection" # the name of the OAuth connection created in the AWS console
        SourceConnectorProperties:
          Salesforce:
            Object: AccountChangeEvent # change events for the standard Account object; a custom object would be MyObject__ChangeEvent
            EnableDynamicFieldUpdate: false
            IncludeDeletedRecords: true
      DestinationFlowConfigList:
        - ConnectorType: S3
          DestinationConnectorProperties:
            S3:
              BucketName: !Ref MyBucket
              S3OutputFormatConfig:
                AggregationConfig:
                  AggregationType: None
                PrefixConfig:
                  PrefixFormat: MINUTE
                  PrefixType: FILENAME
                FileType: JSON
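      # The Filter/PROJECTION task below selects which source fields flow
      # through; each Map task then copies a source field to a destination
      # field (only Name is synced in this minimal example)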
      Tasks:
        - TaskType: Filter
          ConnectorOperator:
            Salesforce: PROJECTION
          SourceFields:
            - Name
        - TaskType: Map
          SourceFields:
            - Name
          TaskProperties:
            - Key: SOURCE_DATA_TYPE
              Value: Name
            - Key: DESTINATION_DATA_TYPE
              Value: Name
          DestinationField: Name
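
To deploy, SAM needs a value for the Environment parameter. Assuming the stack name and region live in your samconfig.toml, something like this should work:

sam build
sam deploy --no-confirm-changeset --parameter-overrides Environment=dev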

Debugging

It’s important that we have a way to troubleshoot in case things go wrong. Since this integration involves several AWS services, let’s see what each one offers:

  • AppFlow run history
  • CloudWatch logs for our Lambda
  • Inspect S3 to see the objects created
  • Inspect the SQS messages created (Monitoring tab), or poll them from the CLI as shown below
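
A quick way to peek at the queue from the CLI without writing a consumer (the queue URL below is a placeholder; copy the real one from the SQS console):

aws sqs receive-message --queue-url "https://sqs.us-east-1.amazonaws.com/123456789012/dev-my-queue.fifo" --max-number-of-messages 10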


About the author

Andrés Canavesi

Software Engineer with 15+ years of experience in software development, specialized in Salesforce, Java and Node.js.

