    Salesforce Output

    The Salesforce Output component uses the Salesforce API to write back the contents of a source table (or view) into a table in Salesforce.

    Properties

    The tables below describe the Salesforce Output component's setup properties, including any actions required of the user.

    Warning: this component is potentially destructive. The output operations performed by this component can delete, overwrite, and truncate target objects within Salesforce, and these operations may be irreversible.
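    Because Delete, Update, and Upsert can irreversibly change the target object, some teams summarise what a run would do before executing it. The sketch below is purely illustrative (the `plan_output` function is ours, not part of Matillion or Salesforce):

```python
def plan_output(operation: str, source_rows: list) -> dict:
    """Summarise what an output operation would do before running it.

    `operation` mirrors the component's Output Operation property:
    Delete, Insert, Update, or Upsert.
    """
    allowed = {"Delete", "Insert", "Update", "Upsert"}
    if operation not in allowed:
        raise ValueError(f"Unknown output operation: {operation}")
    return {
        "operation": operation,
        "rows_affected": len(source_rows),
        # Anything other than a plain Insert can modify or remove
        # existing Salesforce records, so flag it for review.
        "destructive": operation in {"Delete", "Update", "Upsert"},
    }
```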

    Snowflake Properties

    Property | Setting | Description
    Name | String | Input a human-readable name for the component.
    Authentication Method | Select | Select the authentication method: a username/password combination or OAuth.
    Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password".
    Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password".
    Password | String | Provide a valid password corresponding to the Salesforce username. Passwords can be stored in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password".
    Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password".
    Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth".
    Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background; this cannot be cancelled before completion.
    Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML; the same applies to each other content type. This property is only available when Use Bulk API is set to "Yes".
    Connection Options | Parameter | A JDBC parameter supported by the database driver. Manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model.
     | Value | A value for the given parameter.
    Database | Select | Select a Snowflake database. The special value [Environment Default] is the default setting, and uses the database defined in the Matillion ETL environment.
    Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value [Environment Default] is the default setting, and uses the schema defined in the Matillion ETL environment.
    Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown depend on the source schema.
    Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input).
    Output Operation | Select | Select the output operation to be performed on the target object: Delete, Insert, Update, or Upsert.
    Salesforce ID | Select | Select the unique ID of the row within the target object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
    Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output).
     | Target Columns | Specify the columns in the target object to which the source columns will be output.
    On Warnings | Select | Continue: load data despite records that error or are rejected. This is the default setting. Fail: the load fails as soon as a single record errors or is rejected, aborting the Salesforce bulk job.
    Batch Size | Integer | The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to "Yes" (default 10,000), or between 0 and 2,000 when Use Bulk API is set to "No" (default 2,000).
    Records Per Ingest Job | Integer | The number of records to load per bulk job. The component loads this many rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs load concurrently. If this field is empty (the default), Salesforce Output loads all records and batches into Salesforce using a single Salesforce Ingest Job. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to "Yes".
    Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account.
     | Relationships | The name of the relationship. For example, OwnerId is a relationship column in Account, but the relationship is named Owner.
     | Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be.
     | Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent object is identified by its Email column.
    Capture Rejected Entries | Select | Set to "On" to capture any rejected or errored records into an exception table, so they can be flagged for further analysis. The default is "Off". This property is only available when Use Bulk API is set to "Yes".
    Truncate Rejected Entries | Select | When set to "Yes", errored results are replaced on each run. When set to "No", errored results are appended as additional entries on each run. This property is only available when Use Bulk API is set to "Yes".
    Rejected Entries Database | Select | Select a database to hold the rejected entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Table | String | Enter a name for the table that rejected entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to "On".
    Capture Batch Results | Select | When set to "Yes", enables the capture of batch results. This property is only available when Use Bulk API is set to "Yes".
    Truncate Batch Results | Select | When set to "Yes", batch results are replaced on each run. This property is only available when Use Bulk API is set to "Yes".
    Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to "On".
    Auto Debug | Select | On: automatically log debug information about the load. These logs can be found in the Task History and should be included in support requests concerning the component. Off: override any debugging connection options.
    Debug Level | Select | Select the level of detail of the debugging information logged. Beyond level 1, large amounts of data may be logged, resulting in slower execution. 1: log the query, the number of rows returned by it, the start of execution, the time taken, and any errors. 2: also log cache queries and additional information about the request, if applicable. 3: also log the body of the request and the response. 4: also log transport-level communication with the data source, including SSL negotiation. 5: also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
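    The Batch Size and Records Per Ingest Job rules come down to simple batching arithmetic: records are first split across ingest jobs, then each job is split into batches no larger than the API's cap. A hedged sketch of that logic (the function and its defaults are ours, not Matillion's implementation):

```python
def split_into_batches(records, use_bulk_api, batch_size=None, records_per_job=None):
    """Split records into ingest jobs, then into batches, per the documented limits.

    Bulk API allows up to 10,000 records per batch; otherwise the cap is 2,000.
    An empty Records Per Ingest Job means one job holds everything.
    """
    cap = 10_000 if use_bulk_api else 2_000
    if batch_size is None:
        batch_size = cap  # the component's documented default
    if not 0 < batch_size <= cap:
        raise ValueError(f"Batch size must be between 0 and {cap}")
    if records_per_job is not None and records_per_job < 0:
        # The component documents a runtime failure for negative values.
        raise ValueError("Records Per Ingest Job cannot be negative")
    per_job = records_per_job or len(records) or 1
    jobs = [records[i:i + per_job] for i in range(0, len(records), per_job)]
    return [
        [job[i:i + batch_size] for i in range(0, len(job), batch_size)]
        for job in jobs
    ]
```

    For example, 25 records with a job size of 20 and a batch size of 10 yield two jobs: the first holds two batches, the second holds one.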

    Redshift Properties

    Property | Setting | Description
    Name | String | Input a human-readable name for the component.
    Authentication Method | Select | Select the authentication method: a username/password combination or OAuth.
    Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password".
    Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password".
    Password | String | Provide a valid password corresponding to the Salesforce username. Passwords can be stored in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password".
    Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password".
    Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth".
    Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background; this cannot be cancelled before completion.
    Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML; the same applies to each other content type. This property is only available when Use Bulk API is set to "Yes".
    Connection Options | Parameter | A JDBC parameter supported by the database driver. Manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model.
     | Value | A value for the given parameter.
    Source Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value [Environment Default] is the default setting, and uses the schema defined in the Matillion ETL environment.
    Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown depend on the source schema.
    Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input).
    Output Operation | Select | Select the output operation to be performed on the target object: Delete, Insert, Update, or Upsert.
    Salesforce ID | Select | Select the unique ID of the row within the target object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
    Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output).
     | Target Columns | Specify the columns in the target object to which the source columns will be output.
    On Warnings | Select | Continue: load data despite records that error or are rejected. This is the default setting. Fail: the load fails as soon as a single record errors or is rejected, aborting the Salesforce bulk job.
    Batch Size | Integer | The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to "Yes" (default 10,000), or between 0 and 2,000 when Use Bulk API is set to "No" (default 2,000).
    Records Per Ingest Job | Integer | The number of records to load per bulk job. The component loads this many rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs load concurrently. If this field is empty (the default), Salesforce Output loads all records and batches into Salesforce using a single Salesforce Ingest Job. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to "Yes".
    Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account.
     | Relationships | The name of the relationship. For example, OwnerId is a relationship column in Account, but the relationship is named Owner.
     | Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be.
     | Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent object is identified by its Email column.
    Capture Rejected Entries | Select | Set to "On" to capture any rejected or errored records into an exception table, so they can be flagged for further analysis. The default is "Off". This property is only available when Use Bulk API is set to "Yes".
    Truncate Rejected Entries | Select | When set to "Yes", errored results are replaced on each run. When set to "No", errored results are appended as additional entries on each run. This property is only available when Use Bulk API is set to "Yes".
    Rejected Entries Database | Select | Select a database to hold the rejected entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Table | String | Enter a name for the table that rejected entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to "On".
    Capture Batch Results | Select | When set to "Yes", enables the capture of batch results. This property is only available when Use Bulk API is set to "Yes".
    Truncate Batch Results | Select | When set to "Yes", batch results are replaced on each run. This property is only available when Use Bulk API is set to "Yes".
    Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to "On".
    Auto Debug | Select | On: automatically log debug information about the load. These logs can be found in the Task History and should be included in support requests concerning the component. Off: override any debugging connection options.
    Debug Level | Select | Select the level of detail of the debugging information logged. Beyond level 1, large amounts of data may be logged, resulting in slower execution. 1: log the query, the number of rows returned by it, the start of execution, the time taken, and any errors. 2: also log cache queries and additional information about the request, if applicable. 3: also log the body of the request and the response. 4: also log transport-level communication with the data source, including SSL negotiation. 5: also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
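    The Truncate Rejected Entries property amounts to replace-versus-append semantics on the exception table. A minimal sketch of that behaviour, with an in-memory list standing in for the warehouse table (the function name is ours, for illustration only):

```python
def capture_rejected(exception_table, rejected, truncate):
    """Write rejected records to an exception table.

    truncate=True replaces the table's prior contents on each run
    (Truncate Rejected Entries = Yes); truncate=False appends the
    new rejections as additional entries (= No).
    """
    if truncate:
        exception_table.clear()
    exception_table.extend(rejected)
    return exception_table
```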

    BigQuery Properties

    Property | Setting | Description
    Name | String | Input a human-readable name for the component.
    Authentication Method | Select | Select the authentication method: a username/password combination or OAuth.
    Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password".
    Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password".
    Password | String | Provide a valid password corresponding to the Salesforce username. Passwords can be stored in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password".
    Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password".
    Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth".
    Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background; this cannot be cancelled before completion.
    Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML; the same applies to each other content type. This property is only available when Use Bulk API is set to "Yes".
    Connection Options | Parameter | A JDBC parameter supported by the database driver. Manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model.
     | Value | A value for the given parameter.
    Project | Select | Select the BigQuery project. The special value [Environment Default] is the default setting, and uses the project defined in the Matillion ETL environment.
    Dataset | Select | Select the BigQuery dataset. The special value [Environment Default] is the default setting, and uses the dataset defined in the Matillion ETL environment. For more information, see Google's datasets documentation.
    Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown depend on the source schema.
    Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input).
    Output Operation | Select | Select the output operation to be performed on the target object: Delete, Insert, Update, or Upsert.
    Salesforce ID | Select | Select the unique ID of the row within the target object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
    Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output).
     | Target Columns | Specify the columns in the target object to which the source columns will be output.
    On Warnings | Select | Continue: load data despite records that error or are rejected. This is the default setting. Fail: the load fails as soon as a single record errors or is rejected, aborting the Salesforce bulk job.
    Batch Size | Integer | The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to "Yes" (default 10,000), or between 0 and 2,000 when Use Bulk API is set to "No" (default 2,000).
    Records Per Ingest Job | Integer | The number of records to load per bulk job. The component loads this many rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs load concurrently. If this field is empty (the default), Salesforce Output loads all records and batches into Salesforce using a single Salesforce Ingest Job. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to "Yes".
    Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account.
     | Relationships | The name of the relationship. For example, OwnerId is a relationship column in Account, but the relationship is named Owner.
     | Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be.
     | Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent object is identified by its Email column.
    Capture Rejected Entries | Select | Set to "On" to capture any rejected or errored records into an exception table, so they can be flagged for further analysis. The default is "Off". This property is only available when Use Bulk API is set to "Yes".
    Truncate Rejected Entries | Select | When set to "Yes", errored results are replaced on each run. When set to "No", errored results are appended as additional entries on each run. This property is only available when Use Bulk API is set to "Yes".
    Rejected Entries Database | Select | Select a database to hold the rejected entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to "On".
    Rejected Entries Table | String | Enter a name for the table that rejected entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to "On".
    Capture Batch Results | Select | When set to "Yes", enables the capture of batch results. This property is only available when Use Bulk API is set to "Yes".
    Truncate Batch Results | Select | When set to "Yes", batch results are replaced on each run. This property is only available when Use Bulk API is set to "Yes".
    Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to "On".
    Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to "On".
    Auto Debug | Select | On: automatically log debug information about the load. These logs can be found in the Task History and should be included in support requests concerning the component. Off: override any debugging connection options.
    Debug Level | Select | Select the level of detail of the debugging information logged. Beyond level 1, large amounts of data may be logged, resulting in slower execution. 1: log the query, the number of rows returned by it, the start of execution, the time taken, and any errors. 2: also log cache queries and additional information about the request, if applicable. 3: also log the body of the request and the response. 4: also log transport-level communication with the data source, including SSL negotiation. 5: also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
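    Conceptually, a relationship column is resolved by looking up the parent record through its index column, e.g. finding a User's Salesforce Id by its Email. A hedged sketch of that lookup (the function and its arguments are illustrative, not a Matillion API):

```python
def resolve_relationship(rows, parent_index, relationship_column, index_column):
    """Replace index-column values (e.g. an Email) with the parent's Salesforce Id.

    parent_index maps the parent's index-column value to its Salesforce Id,
    mimicking how OwnerId in Account is resolved to a User record.
    """
    resolved = []
    for row in rows:
        key = row[relationship_column]
        if key not in parent_index:
            raise KeyError(f"No parent record with {index_column}={key!r}")
        # Copy the row so the source data is left untouched.
        resolved.append({**row, relationship_column: parent_index[key]})
    return resolved
```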

    Synapse Properties

    Property | Setting | Description
    Name | String | Input a human-readable name for the component.
    Authentication Method | Select | Select the authentication method: a username/password combination or OAuth.
    Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password".
    Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password".
    Password | String | Provide a valid password corresponding to the Salesforce username. Passwords can be stored in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password".
    Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password".
    Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth".
    Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background; this cannot be cancelled before completion.
    Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML; the same applies to each other content type. This property is only available when Use Bulk API is set to "Yes".
    Connection Options | Parameter | A JDBC parameter supported by the database driver. Manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model.
     | Value | A value for the given parameter.
    Source Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value [Environment Default] is the default setting, and uses the schema defined in the Matillion ETL environment.
    Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown depend on the source schema.
    Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input).
    Output Operation | Select | Select the output operation to be performed on the target object: Delete, Insert, Update, or Upsert.
    Salesforce ID | Select | Select the unique ID of the row within the target object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
    Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output).
     | Target Columns | Specify the columns in the target object to which the source columns will be output.
    On Warnings | Select | Continue: load data despite records that error or are rejected. This is the default setting. Fail: the load fails as soon as a single record errors or is rejected, aborting the Salesforce bulk job.
    Batch Size | Integer | The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to "Yes" (default 10,000), or between 0 and 2,000 when Use Bulk API is set to "No" (default 2,000).
    Records Per Ingest Job | Integer | The number of records to load per bulk job. The component loads this many rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs load concurrently. If this field is empty (the default), Salesforce Output loads all records and batches into Salesforce using a single Salesforce Ingest Job. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to "Yes".
    Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account.
     | Relationships | The name of the relationship. For example, OwnerId is a relationship column in Account, but the relationship is named Owner.
     | Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be.
     | Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent object is identified by its Email column.
    Auto Debug | Select | On: automatically log debug information about the load. These logs can be found in the Task History and should be included in support requests concerning the component. Off: override any debugging connection options.
    Debug Level | Select | Select the level of detail of the debugging information logged. Beyond level 1, large amounts of data may be logged, resulting in slower execution. 1: log the query, the number of rows returned by it, the start of execution, the time taken, and any errors. 2: also log cache queries and additional information about the request, if applicable. 3: also log the body of the request and the response. 4: also log transport-level communication with the data source, including SSL negotiation. 5: also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
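    The debug levels are cumulative: each level logs everything the levels below it do. That inclusion rule can be sketched as a simple lookup (the mapping below paraphrases the level descriptions; the names are ours):

```python
DEBUG_DETAILS = {
    1: "query, row count, execution start, duration, errors",
    2: "cache queries and additional request information",
    3: "request and response bodies",
    4: "transport-level communication, including SSL negotiation",
    5: "data source communication and interface commands",
}

def logged_details(level: int) -> list:
    """Return every category logged at the given debug level (cumulative)."""
    if level not in DEBUG_DETAILS:
        raise ValueError("Debug level must be between 1 and 5")
    return [DEBUG_DETAILS[n] for n in range(1, level + 1)]
```

    Higher levels therefore only ever add detail, which is why the documentation warns that anything beyond level 1 can slow execution.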

