
Informatica PowerExchange for Salesforce (Version 9.6.1 HotFix 3)

User Guide

Informatica PowerExchange for Salesforce User Guide

Version 9.6.1 HotFix 3

June 2015

Copyright (c) 1993-2015 Informatica LLC. All rights reserved.

This software and documentation contain proprietary information of Informatica Corporation and are provided under a license agreement containing restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. This Software may be protected by U.S. and/or international Patents and other Patents Pending.

Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III), as applicable.

The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us in writing.

Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange Informatica On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging and Informatica Master Data Management are trademarks or registered trademarks of Informatica Corporation in the United States and in jurisdictions throughout the world. All other company and product names may be trade names or trademarks of their respective owners.

Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright © Sun Microsystems. All rights reserved. Copyright © RSA Security Inc. All Rights Reserved. Copyright © Ordinal Technology Corp. All rights reserved.Copyright © Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright © Meta Integration Technology, Inc. All rights reserved. Copyright © Intalio. All rights reserved. Copyright © Oracle. All rights reserved. Copyright © Adobe Systems Incorporated. All rights reserved. Copyright © DataArt, Inc. All rights reserved. Copyright © ComponentSource. All rights reserved. Copyright © Microsoft Corporation. All rights reserved. Copyright © Rogue Wave Software, Inc. All rights reserved. Copyright © Teradata Corporation. All rights reserved. Copyright © Yahoo! Inc. All rights reserved. Copyright © Glyph & Cog, LLC. All rights reserved. Copyright © Thinkmap, Inc. All rights reserved. Copyright © Clearpace Software Limited. All rights reserved. Copyright © Information Builders, Inc. All rights reserved. Copyright © OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright © International Organization for Standardization 1986. All rights reserved. Copyright © ej-technologies GmbH. All rights reserved. Copyright © Jaspersoft Corporation. All rights reserved. Copyright © International Business Machines Corporation. All rights reserved. Copyright © yWorks GmbH. All rights reserved. Copyright © Lucent Technologies. All rights reserved. Copyright (c) University of Toronto. All rights reserved. Copyright © Daniel Veillard. All rights reserved. Copyright © Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright © MicroQuill Software Publishing, Inc. All rights reserved. Copyright © PassMark Software Pty Ltd. All rights reserved. Copyright © LogiXML, Inc. All rights reserved. Copyright © 2003-2010 Lorenzi Davide, All rights reserved. Copyright © Red Hat, Inc. All rights reserved. Copyright © The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright © EMC Corporation. All rights reserved. Copyright © Flexera Software. All rights reserved. Copyright © Jinfonet Software. All rights reserved. Copyright © Apple Inc. All rights reserved. Copyright © Telerik Inc. All rights reserved. Copyright © BEA Systems. All rights reserved. Copyright © PDFlib GmbH. All rights reserved. Copyright © Orientation in Objects GmbH. All rights reserved. Copyright © Tanuki Software, Ltd. All rights reserved. Copyright © Ricebridge. All rights reserved. Copyright © Sencha, Inc. All rights reserved. Copyright © Scalable Systems, Inc. All rights reserved. Copyright © jQWidgets. All rights reserved.

This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various versions of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.

This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software copyright © 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.

The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright (©) 1993-2006, all rights reserved.

This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.

This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <[email protected]>. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

The product includes software copyright 2001-2005 (©) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/license.html.

The product includes software copyright © 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://dojotoolkit.org/license.

This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.

This product includes software copyright © 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at http://www.gnu.org/software/kawa/Software-License.html.

This product includes OSSP UUID software which is Copyright © 2002 Ralf S. Engelschall, Copyright © 2002 The OSSP Project, Copyright © 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.

This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at http://www.boost.org/LICENSE_1_0.txt.

This product includes software copyright © 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at http://www.pcre.org/license.txt.

This product includes software copyright © 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.

This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE; http://web.mit.edu/Kerberos/krb5-current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE; https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE; http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; and http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html.

This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0) and the Initial Developer’s Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).

This product includes software copyright © 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab. For further information please visit http://www.extreme.indiana.edu/.

This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject to terms of the MIT license.

This Software is protected by U.S. Patent Numbers 5,794,246; 6,014,670; 6,016,501; 6,029,178; 6,032,158; 6,035,307; 6,044,374; 6,092,086; 6,208,990; 6,339,775; 6,640,226; 6,789,096; 6,823,373; 6,850,947; 6,895,471; 7,117,215; 7,162,643; 7,243,110; 7,254,590; 7,281,001; 7,421,458; 7,496,588; 7,523,121; 7,584,422; 7,676,516; 7,720,842; 7,721,270; 7,774,791; 8,065,266; 8,150,803; 8,166,048; 8,166,071; 8,200,622; 8,224,873; 8,271,477; 8,327,419; 8,386,435; 8,392,460; 8,453,159; 8,458,230; 8,707,336; 8,886,617 and RE44,478, International Patents and other Patents Pending.

DISCLAIMER: Informatica Corporation provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica Corporation does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice.

NOTICES

This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions:

1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.

2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.

Part Number: PWX-SFDC-UG-96100-HF3-0001

Table of Contents

Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Informatica My Support Portal. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

Informatica Product Availability Matrixes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Web Site. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica How-To Library. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Knowledge Base. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Support YouTube Channel. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Informatica Global Customer Support. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

Chapter 1: Introduction to PowerExchange for Salesforce. . . . . . . . . . . . . . . . . . . . . . 9

PowerExchange for Salesforce Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

Example of Data Migration from Salesforce. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

Example of Updating Real-time Data to Salesforce. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

Chapter 2: PowerExchange for Salesforce Installation and Configuration. . . . . . 11

PowerExchange for Salesforce Installation and Configuration Overview. . . . . . . . . . . . . . . . . . . 11

Prerequisites. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

Installing the Server Component. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

Installing the Server Component on Windows. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

Installing the Server Component on UNIX. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

Installing the Client Component. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Configuring HTTP Proxy Options at Design Time. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Configuring HTTP Proxy Options from the Developer Tool. . . . . . . . . . . . . . . . . . . . . . . . 14

Configuring HTTP Proxy Options at Run Time. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

Chapter 3: Salesforce Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

Salesforce Connection Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

Salesforce Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

infacmd Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

Creating a Salesforce Connection in the Administrator Tool. . . . . . . . . . . . . . . . . . . . . . . . . . 18

Creating a Salesforce Connection in the Developer Tool. . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

Chapter 4: Salesforce Data Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Salesforce Data Objects Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Standard and Custom Salesforce Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Related Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Rules and Guidelines for Related Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Salesforce Data Object Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Salesforce Data Object Overview Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Salesforce Data Object Read Operation Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

Source Properties of the Data Object Read Operation. . . . . . . . . . . . . . . . . . . . . . . . . . . 22

Output Properties of the Data Object Read Operation. . . . . . . . . . . . . . . . . . . . . . . . . . . 24

Salesforce Data Object Write Operation Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

Input Properties of the Data Object Write Operation. . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

Target Properties of the Data Object Write Operation. . . . . . . . . . . . . . . . . . . . . . . . . . . 27

Importing a Salesforce Data Object. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

Creating a Salesforce Data Object Read or Write Operation. . . . . . . . . . . . . . . . . . . . . . . . . . 29

Chapter 5: Salesforce Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

Salesforce Mappings Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

Salesforce Mapping Read Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

Salesforce Mapping Write Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

Chapter 6: Salesforce Run Time Processing. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

Salesforce Run-time Processing Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

Filtering Source Data by Using the SOQL Filter Condition. . . . . . . . . . . . . . . . . . . . . . . . . . . . 32

Filtering Source Data by Using the Informatica Filter Condition. . . . . . . . . . . . . . . . . . . . . . . . 33

Pushdown Optimization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

Pushdown Optimization Expressions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

Capturing Deleted and Archived Salesforce Records. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

Capturing Changed Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

Continuous CDC Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

CDC Flush Interval Offset. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

Configure a Continuous CDC. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

Time-Period Based CDC Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

Configuring a Time-Period Based CDC. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

Rules and Guidelines for Processing a Time-Period Based CDC Mapping. . . . . . . . . . . . . . 36

Enable Bulk Query. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

Use SFDC Bulk API. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

Configuring the Upsert Target Operation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37

Configuring the Maximum Batch Size. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

Handling Null Values in Update and Upsert Operations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

Override an External ID with an idLookup for Upserts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

Appendix A: Data Type Reference. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

Data Type Reference Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

Salesforce Data Types and Transformation Data Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

Preface

The Informatica PowerExchange for Salesforce User Guide provides information to build Salesforce mappings, extract data from Salesforce objects, and load data into Salesforce objects. It is written for developers who are responsible for extracting data from Salesforce objects and loading data into Salesforce objects.

This book assumes that you have knowledge of web services concepts, relational database concepts, PowerExchange, and Salesforce. You must also be familiar with the interface requirements for other supporting applications. For more information about related Salesforce issues, see the Salesforce documentation.

Informatica Resources

Informatica My Support Portal

As an Informatica customer, you can access the Informatica My Support Portal at http://mysupport.informatica.com.

The site contains product information, user group information, newsletters, access to the Informatica customer support case management system (ATLAS), the Informatica How-To Library, the Informatica Knowledge Base, Informatica Product Documentation, and access to the Informatica user community.

Informatica Documentation

The Informatica Documentation team makes every effort to create accurate, usable documentation. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team through email at [email protected]. We will use your feedback to improve our documentation. Let us know if we can contact you regarding your comments.

The Documentation team updates documentation as needed. To get the latest documentation for your product, navigate to Product Documentation from http://mysupport.informatica.com.

Informatica Product Availability Matrixes

Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types of data sources and targets that a product release supports. You can access the PAMs on the Informatica My Support Portal at https://mysupport.informatica.com/community/my-support/product-availability-matrices.

Informatica Web Site

You can access the Informatica corporate web site at http://www.informatica.com. The site contains information about Informatica, its background, upcoming events, and sales offices. You will also find product and partner information. The services area of the site includes important information about technical support, training and education, and implementation services.

Informatica How-To Library

As an Informatica customer, you can access the Informatica How-To Library at http://mysupport.informatica.com. The How-To Library is a collection of resources to help you learn more about Informatica products and features. It includes articles and interactive demonstrations that provide solutions to common problems, compare features and behaviors, and guide you through performing specific real-world tasks.

Informatica Knowledge Base

As an Informatica customer, you can access the Informatica Knowledge Base at http://mysupport.informatica.com. Use the Knowledge Base to search for documented solutions to known technical issues about Informatica products. You can also find answers to frequently asked questions, technical white papers, and technical tips. If you have questions, comments, or ideas about the Knowledge Base, contact the Informatica Knowledge Base team through email at [email protected].

Informatica Support YouTube Channel

You can access the Informatica Support YouTube channel at http://www.youtube.com/user/INFASupport. The Informatica Support YouTube channel includes videos about solutions that guide you through performing specific tasks. If you have questions, comments, or ideas about the Informatica Support YouTube channel, contact the Support YouTube team through email at [email protected] or send a tweet to @INFASupport.

Informatica Marketplace

The Informatica Marketplace is a forum where developers and partners can share solutions that augment, extend, or enhance data integration implementations. By leveraging any of the hundreds of solutions available on the Marketplace, you can improve your productivity and speed up time to implementation on your projects. You can access Informatica Marketplace at http://www.informaticamarketplace.com.

Informatica Velocity

You can access Informatica Velocity at http://mysupport.informatica.com. Developed from the real-world experience of hundreds of data management projects, Informatica Velocity represents the collective knowledge of our consultants who have worked with organizations from around the world to plan, develop, deploy, and maintain successful data management solutions. If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional Services at [email protected].

Informatica Global Customer Support

You can contact a Customer Support Center by telephone or through the Online Support.

Online Support requires a user name and password. You can request a user name and password at http://mysupport.informatica.com.

The telephone numbers for Informatica Global Customer Support are available from the Informatica web site at http://www.informatica.com/us/services-and-training/support-services/global-support-centers/.

Chapter 1

Introduction to PowerExchange for Salesforce

This chapter includes the following topics:

• PowerExchange for Salesforce Overview, 9

• Example of Data Migration from Salesforce, 10

• Example of Updating Real-time Data to Salesforce, 10

PowerExchange for Salesforce Overview

PowerExchange for Salesforce provides connectivity between Informatica Developer and Salesforce. You can use PowerExchange for Salesforce to read data from and write data to Salesforce. You can add a Salesforce data object operation as a source or a target in a mapping and run the mapping to read or write data.

Salesforce sources and targets represent objects in the Salesforce object model. Salesforce objects are tables that correspond to tabs and other user interface elements on the Salesforce website. For example, the Account object contains the information that appears in fields on the Salesforce Account tab. You can view, create, update, and delete data in Salesforce objects.

PowerExchange for Salesforce uses the Salesforce security model to enforce data access controls. You can access data based on the Salesforce organization associated with the user account you use to connect to Salesforce. Your access to data also depends on the user privileges and the field-level and row-level permissions associated with the login.

PowerExchange for Salesforce uses the Simple Object Access Protocol API (SOAP API) to read or write a small volume of data in near real-time mode. PowerExchange for Salesforce uses the Salesforce Bulk API to read large amounts of data from Salesforce sources or write large amounts of data to Salesforce targets. PowerExchange for Salesforce generates a Salesforce Object Query Language (SOQL) query to read data from Salesforce objects. SOQL is a derivative of SQL.
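For example, a read operation on the Account object with a row filter might cause PowerExchange for Salesforce to issue a SOQL query similar to the following one. The query is an illustrative sketch only; the selected fields and the filter value are assumed for the example and are not output captured from the product:

SELECT Id, Name, AnnualRevenue FROM Account WHERE AnnualRevenue > 100000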

You can run a profile on a Salesforce data object. A Salesforce data object profile discovers information about the column data and metadata in the Salesforce data source.

PowerExchange for Salesforce is listed under the Cloud connection category in the Developer tool and the Administrator tool.

PowerExchange for Salesforce supports Salesforce API v31 and connections to multiple versions of the Salesforce API. You can use PowerExchange for Salesforce to connect to Salesforce API v30 and v31.

Example of Data Migration from Salesforce

Your organization needs to migrate sales opportunity information from a Salesforce system that the sales team uses to a relational data source that the executive management team uses. You can create a data object in the Model repository and import the Opportunity object. You can then create a mapping that reads the opportunity data from the Salesforce data object and writes it to the relational data object. The executive management team can reconcile and analyze the data written to the relational data object.

Example of Updating Real-time Data to Salesforce

Your organization needs to update real-time sales order processing status from an enterprise resource planning (ERP) system that the logistics team uses to a Salesforce system that was used to create the order. You can create a data object and specify the update strategy. You can then create a mapping that reads shipping details from the ERP system and writes those records to the Salesforce data objects. Sales managers can use the updated information to track sales orders.

Chapter 2

PowerExchange for Salesforce Installation and Configuration

This chapter includes the following topics:

• PowerExchange for Salesforce Installation and Configuration Overview, 11

• Prerequisites, 11

• Installing the Server Component, 12

• Installing the Client Component, 14

• Configuring HTTP Proxy Options at Design Time, 14

• Configuring HTTP Proxy Options at Run Time, 15

PowerExchange for Salesforce Installation and Configuration Overview

The PowerExchange for Salesforce installation consists of a server installation and a client installation.

Install the PowerExchange for Salesforce server component after you install the Informatica services. The server binaries are copied to the Informatica installation directory. Use the Model Repository Service to store and access the Salesforce objects in the repository. Use the Data Integration Service to run mappings.

Install the PowerExchange for Salesforce client component after you install the Informatica clients. Use the client component to create a connection, import Salesforce objects, create a data object and data object operation, and create mappings using the Developer tool.

Prerequisites

Before you install PowerExchange for Salesforce, install and configure the Informatica services and clients. Create a Data Integration Service and a Model Repository Service in the Informatica domain.

Before you install the server component, perform the following steps:

• Back up the domain configuration repository.

• Back up the Model repository.

• Identify the node that you want to serve as the master gateway node.

• Shut down the domain.

Installing the Server Component

The PowerExchange for Salesforce server component installs the Data Integration Service and Model Repository Service components.

If multiple nodes exist in your environment, you must first install the server component on the master gateway node. You can then install the server component on the other nodes in the domain.

You can install the server component on Windows or Linux machines.

Installing the Server Component on Windows

1. Navigate to the root directory of the extracted installer files.

2. Run install.bat from the installation package.

The Welcome page appears.

3. Click Next.

The Installation Directory page appears.

4. Enter the absolute path to the Informatica installation directory. Click Browse to find the directory or use the default directory.

By default, the server components are installed in the following location:
C:\Informatica\<version folder>\

If you did not shut down the domain, a message appears asking you to shut down the domain.

5. Click Next.

The Pre-Installation Summary page appears.

6. Verify that all installation requirements are met and click Install.

The Domain Information Panel page appears.

7. View or enter the domain information:

Property Description

Domain Name Name of the domain where Informatica services are installed. This field is read-only.

Node Name Name of the node on which you are installing the PowerExchange for Salesforce server component. This field is read-only.

Domain User Name User name of the administrator for the domain.

Domain Password Password for the domain administrator.

Master Gateway Node Indicates whether the node on which you are installing the server component is the master gateway node. Select the option for the master gateway node. Clear the option for all other nodes on which you install the server component.

8. Click Next.

The installer shows the progress of the installation. When the installation is complete, the Post-Installation Summary page displays the status of the installation.

9. Click Done to close the installer.

For more information about the tasks performed by the installer, view the installation log files.

Installing the Server Component on UNIX

1. Navigate to the root directory of the extracted installer files.

2. Enter ./install.sh at the command prompt.

Note: The install.sh file must have executable permissions.

3. Enter the path to the Informatica installation directory.

By default, the server components are installed in the following location:
<User Home Directory>/Informatica/<version folder>

If you did not shut down the domain, a message appears asking you to shut down the domain.

4. Review the installation information and press Enter to begin the installation.

5. View or enter the information of the domain:

Property Description

Domain Name Name of the domain where Informatica services are installed. This field is read-only.

Node Name Name of the node on which you are installing the PowerExchange for Salesforce server component. This field is read-only.

Domain User Name User name of the administrator for the domain.

Domain Password Password for the domain administrator.

Master Gateway Node Indicates whether the node on which you are installing the server component is the master gateway node. Select from the following options:
• Yes. Select Yes for the node that you want to serve as the master gateway node.
• No. Select No for all other nodes on which you install the server component.

For more information about the tasks performed by the installer, view the installation log files.

Installing the Client Component

Install the client component on every Informatica Developer client machine that connects to the domain where the PowerExchange for Salesforce server component is installed.

1. Unzip the installation archive and navigate to the root directory of the extracted installer files.

2. Run the install.bat script file.

The Welcome page appears.

3. Click Next.

The Installation Directory page appears.

4. Enter the absolute path to the Informatica installation directory. Click the Browse button to find the directory or use the default directory.

5. Click Next.

The Pre-Installation Summary page appears.

6. Verify that all installation requirements are met and click Install.

The installer shows the progress of the installation. When the installation is complete, the Post-Installation Summary page displays the status of the installation.

7. Click Done to close the installer.

For more information about the tasks performed by the installer, view the installation log files.

Configuring HTTP Proxy Options at Design Time

If your organization uses a proxy server to access the internet, you can configure the HTTP proxy server authentication settings at design time. You can configure the HTTP proxy server authentication by using the developerCore.ini file.

Configuring HTTP Proxy Options from the Developer Tool

If your organization uses a proxy server to access the internet, you can configure the HTTP proxy server authentication settings from the developerCore.ini file.

Perform the following tasks to configure the HTTP Proxy Options from the Developer tool:

• Ensure that you enable the proxy server settings from your web browser.

• Access the developerCore.ini file at the following location:

<Informatica Installation Location>\clients\DeveloperClient

• Add the HTTP Proxy options to the developerCore.ini file.

The following table describes the properties that you must add to the developerCore.ini file:

Property Description

-Dhttp.proxyHost= Name of the HTTP proxy server.

-Dhttp.proxyPort= Port number of the HTTP proxy server.

-Dhttp.proxyUser= Authenticated user name for the HTTP proxy server. This property is required if the proxy server requires authentication.

-Dhttp.proxyPassword= Password for the authenticated user. This property is required if the proxy server requires authentication. Note: The password is in plain text and not encrypted.

-Dhttp.nonProxyHosts= List of host names or IP addresses for which you must not use the proxy server. Separate the entries in the list with a pipe symbol (|). For example, localhost|10.20.30.40|myHost. Specify the IP address or name of the machine on which the Informatica gateway node runs so that the Developer tool connects to the domain.

-Dhttps.proxyHost= Name of the HTTPS proxy server.

-Dhttps.proxyPort= Port number of the HTTPS proxy server.
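For example, after you add the options, the end of the developerCore.ini file might contain entries similar to the following sketch. The host names, port numbers, user name, password, and gateway host name are placeholder values for illustration only; replace them with the values for your proxy server and domain:

-Dhttp.proxyHost=proxy.example.com
-Dhttp.proxyPort=8080
-Dhttp.proxyUser=proxy_user
-Dhttp.proxyPassword=proxy_password
-Dhttp.nonProxyHosts=localhost|10.20.30.40|infa_gateway_host
-Dhttps.proxyHost=proxy.example.com
-Dhttps.proxyPort=8443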

Configuring HTTP Proxy Options at Run Time

If your organization uses a proxy server to access the internet, you must configure the HTTP proxy server authentication settings for the Data Integration Service.

1. Open the Administrator tool.

2. Click the Administration tab, and then select the Data Integration Service.

3. Click the Properties tab.

4. Click Edit in the HTTP Proxy Server Properties section.

5. Configure the following properties:

Property Description

HTTP Proxy Server Host Name of the HTTP proxy server.

HTTP Proxy Server Port Port number of the HTTP proxy server. Default is 8080.

HTTP Proxy Server User Authenticated user name for the HTTP proxy server. This property is required if the proxy server requires authentication.

HTTP Proxy Server Password Password for the authenticated user. This property is required if the proxy server requires authentication.

HTTP Proxy Server Domain Domain for authentication.

Chapter 3

Salesforce Connections

This chapter includes the following topics:

• Salesforce Connection Overview, 16

• Salesforce Connection Properties, 16

• infacmd Connection Properties, 17

• Creating a Salesforce Connection in the Administrator Tool, 18

• Creating a Salesforce Connection in the Developer Tool, 19

Salesforce Connection Overview

Use a Salesforce connection to access objects in a Salesforce application.

Create a connection to import Salesforce metadata to create data objects, preview data, and run mappings.

You can create a Salesforce connection in the Developer tool, the Administrator tool, and through infacmd isp.

Salesforce Connection Properties

Use a Salesforce connection to connect to a Salesforce object.

The following table describes the Salesforce connection properties:

Property Description

Name The name of the connection. The name is not case sensitive and must be unique within the domain. It cannot exceed 128 characters, contain spaces, or contain the following special characters: ~ ` ! $ % ^ & * ( ) - + = { [ } ] | \ : ; " ' < , > . ? /

ID The string that the Data Integration Service uses to identify the connection. The ID is not case sensitive. It must be 255 characters or less and must be unique in the domain. You cannot change this property after you create the connection. Default value is the connection name.

Description The description of the connection. The description cannot exceed 765 characters.

Location The Informatica domain where you want to create the connection.

Type The connection type. Select Salesforce.

User Name The Salesforce user name.

User Password The password for the Salesforce user name. To access Salesforce outside your organization's trusted networks, you must append a security token to your password to log in to the API or a desktop client. To receive or reset your security token, log in to Salesforce and click Setup | My Personal Information | Reset My Security Token. The password is case sensitive.

Service URL The URL of the Salesforce service you want to access. In a test or development environment, you might want to access the Salesforce Sandbox testing environment. For more information about the Salesforce Sandbox, see the Salesforce documentation.

infacmd Connection Properties

You can create a Salesforce connection with the infacmd isp CreateConnection command. You can update a Salesforce connection with the infacmd isp UpdateConnection command.

Enter connection options in the following format:

... -o option_name=value option_name=value ...

For example,

infacmd createConnection -dn DomainName -un Domain_UserName -pd Domain_Pwd -cn conname -cid conname -ct salesforce -o userName=salesforceUserName password=salesforcePWD service_URL=https://login.salesforce.com/services/Soap/u/30.0

To enter multiple options, separate them with a space. To enter a value that contains a space or other non-alphanumeric character, enclose the value in quotation marks.
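For example, the following command is a sketch of how you might update the password for an existing Salesforce connection with the UpdateConnection command. The domain, user, and connection values are placeholders, and the quotation marks preserve the space in the password value:

infacmd isp UpdateConnection -dn DomainName -un Domain_UserName -pd Domain_Pwd -cn conname -o password="new Password"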

The following table describes the Salesforce connection options for the infacmd isp CreateConnection and UpdateConnection commands:

Property Description

userName Salesforce user name.

password Password for the Salesforce user name. The password is case sensitive. To access Salesforce outside the trusted network of your organization, you must append a security token to your password to log in to the API or a desktop client. To receive or reset your security token, log in to Salesforce and click Setup | My Personal Information | Reset My Security Token.

serviceURL URL of the Salesforce service that you want to access. In a test or development environment, you might want to access the Salesforce Sandbox testing environment. For more information about the Salesforce Sandbox, see the Salesforce documentation.

Creating a Salesforce Connection in the Administrator Tool

Create a connection before you import Salesforce data objects, preview data, or run mappings. When you create a Salesforce connection, you enter information such as a connection ID and the URL of the Salesforce service you want to access.

1. In the Administrator tool, click the Domain tab.

2. Click the Connections view.

3. In the Navigator, select the domain.

4. In the Navigator, click Actions > New > Connection.

The New Connection dialog box appears.

5. In the New Connection dialog box, select Cloud > Salesforce, and then click OK.

The New Connection wizard appears.

6. Enter a connection name.

7. Enter an ID for the connection.

8. Optionally, enter a connection description.

9. Enter the connection properties.

10. Click Test Connection to verify that you can connect to Salesforce.

11. Click Finish.

Creating a Salesforce Connection in the Developer Tool

Create a connection before you import Salesforce data objects, preview data, or run mappings. When you create a Salesforce connection, you enter information such as a connection ID and the URL of the Salesforce service you want to access.

1. Click Window > Preferences.

2. Select Informatica > Connections.

3. Expand the domain.

4. Select Cloud > Salesforce and click Add.

5. Enter a connection name.

6. Enter an ID for the connection.

7. Optionally, enter a connection description.

8. Select the domain where you want to create the connection.

9. Select Salesforce as the connection type.

10. Click Next.

11. Configure the connection properties.

12. Click Test Connection to verify that you can connect to the Salesforce system.

13. Click Finish.

Chapter 4

Salesforce Data Objects

This chapter includes the following topics:

• Salesforce Data Objects Overview, 20

• Standard and Custom Salesforce Objects, 20

• Related Objects, 21

• Salesforce Data Object Views, 21

• Salesforce Data Object Overview Properties, 21

• Salesforce Data Object Read Operation Properties, 22

• Salesforce Data Object Write Operation Properties, 25

• Importing a Salesforce Data Object, 28

• Creating a Salesforce Data Object Read or Write Operation, 29

Salesforce Data Objects Overview

A Salesforce data object is a physical data object that uses a Salesforce object as a source and a target. A Salesforce data object is a representation of data that is based on a Salesforce object.

Import a Salesforce object into the Developer tool to create a Salesforce data object. After you create a data object, create a data object read or write operation. You can use the data object read operation as a source and the data object write operation as a target in a mapping.

Standard and Custom Salesforce Objects

Use the Developer tool to import Salesforce objects and create a Salesforce data object. You can import both standard and custom Salesforce objects.

Standard object types are objects packaged within Salesforce, such as Account, AccountPartner, and Opportunity.

Custom object types extend the Salesforce data for an organization by defining data entities that are unique to the organization. Salesforce administrators can define custom fields for both standard and custom objects.

When you import a Salesforce object, use a Salesforce login to connect to the Salesforce service. The Developer tool generates a list of objects that are available for import.

Related Objects

You might need to read data from more than one object at a time. The Data Integration Service generates SOQL relationship queries to read data from related objects.

For example, you can read all accounts created by Tom Smith and the contacts associated with those accounts. You can use PowerExchange for Salesforce to create parent-to-child relationships that connect the objects. Parent-to-child relationships exist between many types of objects. For example, Account is a parent of Contact, Assets, and Cases.

Use PowerExchange for Salesforce to read related objects. Each object can have one related object. For example, you can create a data object called Account Details. Select Account as the parent object, and either Contact or Opportunity as the child object. The relationship persists when you create a Salesforce data object read operation from the Account Details data object.
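For example, if you select Account as the parent object and Contact as the related object, the Data Integration Service might generate a parent-to-child relationship query similar to the following sketch. The selected fields are assumed for the example; the actual query depends on the ports in the read operation:

SELECT Name, (SELECT FirstName, LastName FROM Contacts) FROM Account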

Rules and Guidelines for Related Objects

Consider the following rules and guidelines when you import related objects in a Salesforce data object:

• You must select a parent object to create a data object that has related objects.

• You cannot import multiple parent objects in a single data object.

• You can select one related object for each parent object.

• You cannot read data from a related object while using Bulk API. You can read data from one parent object.

Salesforce Data Object Views

The Salesforce data object contains views to edit the object name and the properties.

After you create a Salesforce data object, you can change the data object properties in the following data object views:

• Overview view. Edit the Salesforce data object name, description, and object.

• Data Object Operation view. View and edit the properties that the Data Integration Service uses when it reads data from or writes data to a Salesforce data object.

When you create a mapping that uses a Salesforce source, you can view the data object read properties in the Properties view.

When you create a mapping that uses a Salesforce target, you can view the data object write properties in the Properties view.

Salesforce Data Object Overview Properties

The Overview view displays general information about the Salesforce data object and detailed information about the Salesforce object that you imported.

The following table describes the general properties that you configure for a Salesforce data object:

Property Description

Name Name of the Salesforce data object.

Description Description of the Salesforce data object.

Connection Name of the Salesforce connection.

The following table describes the Salesforce object properties that you can view:

Property Description

Name Name of the Salesforce object.

Type Native data type of the Salesforce object.

Description Description of the Salesforce object.

Salesforce Data Object Read Operation Properties

The Data Integration Service reads data from a Salesforce object based on the data object read operation. The Developer tool displays the data object read operation properties of the Salesforce data object in the Data Object Operation view.

You can view or configure the data object read operation from the source and output properties.

Source properties

Represents data that the Data Integration Service reads from the Salesforce object. Select the source properties to view data such as the name and description of the Salesforce object and the column properties.

Output properties

Represents data that the Data Integration Service passes into the mapping pipeline. Select the output properties to edit the port properties of the data object read operation. You can also set advanced properties, such as row limit and Salesforce bulk API.

Source Properties of the Data Object Read Operation

The source properties are populated based on the Salesforce object that you added when you created a data object. The source properties of the data object read operation include general and column properties that apply to the Salesforce object.

You can view the source properties of the data object read operation from the General, Column, and Advanced tabs.

General Properties

The general properties display the name and description of the data object read operation.



Column Properties

The column properties display the data types, precision, and scale of the source property in the data object read operation.

The following table describes the source column properties of the data object read operation:

Name: Name of the column.
Type: Native data type of the column.
Precision: Maximum number of significant digits for numeric data types, or maximum number of characters for string data types. For numeric data types, precision includes scale.
Scale: Maximum number of digits after the decimal point for numeric values.
Description: Description of the column.
Creatable: Indicates whether the field allows inserts.
Updateable: Indicates whether the field allows updates.
ExternalId: Salesforce custom fields only. Indicates whether the field is designated as an external ID field. Each Salesforce object can contain a single custom field designated as the external ID field. Salesforce appends custom field names with "__c". For more information about external ID and custom fields, see the Salesforce documentation.
SforceName: Field name in Salesforce.
ReferenceTo: Object that the field references.
IDLookup: Indicates whether the field can be used to specify a record in an upsert call. The ID field of each object and some Name fields have this property. There are exceptions, so use Salesforce to check for this property in any object that you want to upsert.
Filterable: Indicates whether the field can be used in the WHERE clause of an SOQL query.
Label: Field label in Salesforce.
Access Type: Indicates whether the field has read and write permissions.

Advanced Properties

The advanced properties display the physical name of the Salesforce object.



Output Properties of the Data Object Read Operation

The output properties represent data that the Data Integration Service passes into the mapping pipeline. Select the output properties to edit the port properties of the data object read operation.

The output properties of the data object read operation include general properties that apply to the data object operation. The output properties also include port, source, query, and advanced properties that apply to the Salesforce object.

You can view and change the output properties of the data object read operation from the General, Ports, Sources, Query, and Advanced tabs.

General Properties

The general properties display the name and description of the data object read operation.

Ports Properties

The output ports properties display the data types, precision, and scale of the data object read operation.

The following table describes the output ports properties that you configure in the data object read operation:

Name: Name of the port.
Type: Data type of the port.
Precision: Maximum number of significant digits for numeric data types, or maximum number of characters for string data types. For numeric data types, precision includes scale.
Scale: Maximum number of digits after the decimal point for numeric values.
Description: Description of the port.

Sources Properties

The sources properties list the Salesforce objects in the data object read operation.

Advanced Properties

Use the advanced properties to specify the data object read operation properties to read data from Salesforce objects.

The following table describes the advanced properties that you configure in the data object read operation:

SOQL Filter Condition: Filters Salesforce source records.
Row Limit: Specifies the maximum number of rows that the Data Integration Service processes. Default is 0, which indicates that the Data Integration Service processes all records.
Use QueryAll: Runs a query that returns all rows, including active, archived, and deleted rows. Otherwise, the Data Integration Service returns only active rows.
Enable Bulk Query: Uses the Salesforce Bulk API to read batch files that contain large amounts of data. By default, the Data Integration Service uses the SOAP Salesforce API.

Salesforce Data Object Write Operation Properties

The Data Integration Service writes data to a Salesforce object based on the data object write operation. The Developer tool displays the data object write operation properties for the Salesforce data object in the Data Object Operation section.

You can view the data object write operation from the Input and Target properties.

Input properties

Represent data that the Data Integration Service reads from an enterprise resource planning (ERP) system or a relational data object. Select the input properties to edit the port properties and specify the advanced properties of the data object write operation.

Target properties

Represent data that the Data Integration Service writes to Salesforce. Select the target properties to view data, such as the name, description, and the relationship of the Salesforce object.

Note: Information about rejected rows in a SOAP writer session is written to the Data Integration Service logs.

Input Properties of the Data Object Write Operation

Input properties represent data that the Data Integration Service reads from an enterprise resource planning (ERP) system or a relational data object. Select the input properties to edit the port properties of the data object write operation. You can also specify advanced data object write operation properties to write data to Salesforce objects.

The input properties of the data object write operation include general properties that apply to the data object write operation. They also include port, source, and advanced properties that apply to the data object write operation.

You can view and change the input properties of the data object write operation from the General, Ports, Sources, and Advanced tabs.



General Properties

The general properties list the name and description of the data object write operation.

Ports Properties

The input ports properties list the data types, precision, and scale of the data object write operation.

The following table describes the input ports properties that you must configure in the data object write operation:

Name: Name of the port.
Type: Data type of the port.
Precision: Maximum number of significant digits for numeric data types, or maximum number of characters for string data types. For numeric data types, precision includes scale.
Scale: Maximum number of digits after the decimal point for numeric values.
Description: Description of the port.

Sources Properties

The sources properties list the Salesforce objects in the data object write operation.

Advanced Properties

The advanced properties allow you to specify data object write operation properties to write data to Salesforce objects.

You can configure the following advanced properties in the data object write operation:

Treat Insert as Upsert: Upserts any record flagged as insert. By default, the Data Integration Service treats all records as insert.
Treat Update as Upsert: Upserts any record flagged as update. Select this property when you use the Update Strategy transformation in the mapping to flag records as update.
Max Batch Size: Maximum number of records the Data Integration Service writes to a Salesforce target in one batch. Default is 200 records. Not used in Bulk API target sessions.
Set Fields to NULL: Replaces values in the target with null values from the source. By default, the Data Integration Service does not replace values in a record with null values during an update or upsert operation. The Data Integration Service retains the existing values.
Use Idlookup Field for Upserts: Uses the Salesforce idLookup field to identify target records that need to be upserted. If you do not select this property, use an external ID for the upsert operation. If you do not select this property and do not provide an external ID, the session fails.
Use this ExternalId/IdLookup field for Updates: The exact name of the external ID or idLookup field to use for updates. By default, the Data Integration Service uses the first external ID or idLookup field in the target. Use this property when you want to use a different field for updates.
Use SFDC Bulk API: Uses the Salesforce Bulk API to load batch files that contain large amounts of data to Salesforce targets. By default, the Data Integration Service uses the standard Salesforce API.
Monitor Bulk Job Until All Batches Processed: Monitors a Bulk API target session. When you select this property, the Data Integration Service logs the status of each batch in the session log. If you do not select this property, the Data Integration Service does not generate complete session statistics for the session log.
Override Parallel Concurrency with Serial: Instructs the Salesforce Bulk API to write batches to targets serially. By default, the Bulk API writes batches in parallel.
Enable Field Truncation Attribute: Allows Salesforce to truncate target data that is larger than the target field. When you select this property, Salesforce truncates overflow data and writes the row to the Salesforce target.
Enable Hard Deletes for Bulk API: Permanently deletes rows from Salesforce targets in a Bulk API target session.
Set the Interval for Polling Bulk Job Status: Number of seconds the Data Integration Service waits before polling Salesforce for information about a Bulk API target session. Enter a positive integer. By default, the Data Integration Service polls every 10 seconds. An illustrative polling sketch follows this table.
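To illustrate the polling behavior that the Set the Interval for Polling Bulk Job Status property describes, the following sketch shows a simple poll-and-wait loop. The check_status callable is a hypothetical stand-in for the call that the Data Integration Service makes to Salesforce; it is not a real API:

import time

# Hypothetical sketch of polling a Bulk API job until it finishes.
# check_status is any callable that returns the current job state.
def wait_for_bulk_job(check_status, poll_interval_seconds=10):
    while True:
        status = check_status()
        if status in ("Completed", "Failed"):
            return status
        time.sleep(poll_interval_seconds)   # default poll interval is 10 seconds

# Example with a fake status sequence instead of a real Salesforce job.
statuses = iter(["InProgress", "InProgress", "Completed"])
print(wait_for_bulk_job(lambda: next(statuses), poll_interval_seconds=0))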

Target Properties of the Data Object Write Operation

The target properties represent the data that is used to populate the Salesforce data object that you added when you created the data object. The target properties of the data object write operation include general and column properties that apply to the Salesforce objects. You can view the target properties of the data object write operation from the General, Column, and Advanced tabs.

General Properties

The general properties display the name and description of the Salesforce objects.



Column Properties

The column properties display the data types, precision, and scale of the target property in the data object write operation.

You can view the following target column properties of the data object write operation:

Name: Name of the column.
Type: Native data type of the column property.
Precision: Maximum number of significant digits for numeric data types, or maximum number of characters for string data types. For numeric data types, precision includes scale.
Scale: Maximum number of digits after the decimal point for numeric values.
Primary Key: Determines whether the column property is a part of the primary key.
Description: Description of the column property.

Advanced Properties

The advanced properties display the physical name of the Salesforce objects.

Importing a Salesforce Data Object

Import a Salesforce data object to read data from a Salesforce object.

1. Select a project or folder in the Object Explorer view.

2. Click File > New > Data Object.

3. Select Salesforce Data Object and click Next.

The New Salesforce Data Object dialog box appears.

4. Enter a name for the data object.

5. Click Browse next to the Location option and select the target project or folder.

6. Click Browse next to the Connection option and select the Salesforce connection from which you want to import the Salesforce object.

7. To add an object, click Add next to the Selected Resource(s) option.

The Add Resource dialog box appears.

8. Select a Salesforce object. You can search for it or navigate to it.

• Navigate to the Salesforce object that you want to import and click OK.

• To search for the Salesforce object, enter the name of the Salesforce object you want to add. Click OK.

9. If required, add additional objects to the Salesforce data object.

You can also add objects to a Salesforce data object after you create it.



10. Click Finish.

The data object appears under Physical Data Objects in the project or folder in the Object Explorer view.

Creating a Salesforce Data Object Read or Write Operation

You can add a Salesforce data object read operation to a mapping or mapplet as a source, and a data object write operation as a target. You can create the data object read or write operation for one or more Salesforce data objects.

Before you create a Salesforce data object read or write operation, you must create at least one Salesforce data object.

1. Select the data object in the Object Explorer view.

2. Right-click and select New > Data Object Operation.

The Data Object Operation dialog box appears.

3. Enter a name for the data object read or write operation.

4. Select Read or Write as the type of data object operation.

5. Click Add.

The Select Resources dialog box appears.

6. Select the Salesforce object for which you want to create the data object read or write operation and click OK.

7. Click Finish.

The Developer tool creates the data object read or write operation for the selected data object.



Chapter 5

Salesforce Mappings

This chapter includes the following topics:

• Salesforce Mappings Overview, 30

• Salesforce Mapping Read Example, 30

• Salesforce Mapping Write Example, 31

Salesforce Mappings Overview

After you create a Salesforce data object read or write operation, you can develop a mapping.

You can define the following objects in the mapping:

• Salesforce data object read operation as the input to read data from Salesforce data objects.

• Relational, flat file, or any supported data object as the output.

• Relational, flat file, or any supported data object as the input.

• Salesforce data object write operation as the output to write data to Salesforce data objects.

Validate and run the mapping to read data from Salesforce sources, and write to a Salesforce object.

Salesforce Mapping Read Example

Your organization needs to migrate real-time sales opportunity information from a Salesforce system that is used by the sales team to a relational data source that is used internally by the executive sales management team.

Create a mapping that reads opportunity information in real time and writes those records to a table.

You can use the following objects in a Salesforce mapping:

Mapping Input

The mapping source is a Salesforce data object that contains the Opportunity object. Add the Opportunity object to the physical data object.



Create a data object read operation and set the time limit and flush interval for change data capture in the data object read operation. Add the data object read operation to the mapping.

CDC Time Limit=300
Flush Interval=60

Mapping Output

Add a relational data object to the mapping as an output.

After you run the mapping, the Data Integration Service writes the extracted opportunity information to the target table. Sales managers can use the information to track sales opportunities.

Salesforce Mapping Write Example

Your organization needs to update real-time sales order processing status from an ERP system that is used by the logistics team to a Salesforce system that was used to create the order.

Create a mapping that reads real-time sales order processing status from the ERP system, and writes those records to Salesforce.

You can use the following objects in a Salesforce mapping:

Mapping Input

Add a relational data object to the mapping as an input.

Mapping Output

Add a Salesforce data object write operation to the mapping as an output.

The mapping target is a Salesforce data object that contains the Order object. Add the Order object to the physical data object.

Create a data object write operation and specify the update strategy in the data object write operation. Add the data object write operation to the mapping.

After every mapping run, the Data Integration Service writes the extracted order status information to the Salesforce target. Sales managers can use the updated information to track sales orders.



Chapter 6

Salesforce Run Time Processing

This chapter includes the following topics:

• Salesforce Run-time Processing Overview, 32

• Filtering Source Data by Using the SOQL Filter Condition, 32

• Filtering Source Data by Using the Informatica Filter Condition, 33

• Pushdown Optimization, 33

• Capturing Deleted and Archived Salesforce Records, 34

• Capturing Changed Data, 34

• Enable Bulk Query, 36

• Use SFDC Bulk API, 37

• Configuring the Upsert Target Operation, 37

• Configuring the Maximum Batch Size, 38

• Handling Null Values in Update and Upsert Operations, 38

• Override an External ID with an idLookup for Upserts, 38

Salesforce Run-time Processing Overview

When you develop a Salesforce mapping, you define the data object operation read or write properties. The data object read operation determines how the Data Integration Service reads data from Salesforce sources, and the data object write operation determines how the Data Integration Service writes data to Salesforce targets.

Filtering Source Data by Using the SOQL Filter Condition

When you configure a mapping that reads data from a Salesforce source, you can enter a filter condition to filter records read from the source. When you enter a filter condition, the Data Integration Service generates an SOQL query based on the objects and fields included in the Salesforce source and adds the filter condition to the query as a WHERE clause.

To filter records from a Salesforce source, set the SOQL filter condition in the data object read operation. For example, enter the following filter condition to read records from the Salesforce Account object that were created before October 30, 2012:

CreatedDate < '2012-10-30T00:00:00.000Z'

Enter a filter condition based on the SOQL syntax defined in the Salesforce documentation. The Salesforce API validates the SOQL syntax at run time. If you enter a filter condition that is not valid, the mapping fails.
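As an illustration only, the following sketch shows how a filter condition of this kind might be appended to the generated query as a WHERE clause. The SELECT list and object name are assumptions; the actual query is built by the Data Integration Service:

# Hypothetical sketch: append an SOQL filter condition as a WHERE clause.
filter_condition = "CreatedDate < '2012-10-30T00:00:00.000Z'"
base_query = "SELECT Id, Name, CreatedDate FROM Account"   # assumed field list

query = f"{base_query} WHERE {filter_condition}" if filter_condition else base_query
print(query)
# SELECT Id, Name, CreatedDate FROM Account WHERE CreatedDate < '2012-10-30T00:00:00.000Z'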

Filtering Source Data by Using the Informatica Filter Condition

When you specify a source filter, the Data Integration Service adds a WHERE clause to the default query. The Informatica filter condition uses the AND operator to combine multiple filter conditions specified through the Expression Editor.

To filter records from a Salesforce object, specify the conditions in the Expression Editor of a data object read operation. You can access the Expression Editor from the Query tab of a Salesforce data object read operation. You can filter records that have decimal, integer, and string data types.

For example, if you want to filter records for the Account object where the billing state is California, specify the following filter condition in the Expression Editor:

Account.BillingState=CA

Note: When you use both the SOQL filter condition and the Informatica filter condition, the Data Integration Service applies only the Informatica filter condition, and ignores the SOQL filter condition.
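As an illustration only, the following sketch shows how multiple Informatica filter conditions could be combined with the AND operator into a single WHERE clause. The field names and literal quoting are assumptions, not the exact output of the Data Integration Service:

# Hypothetical sketch: combine several filter conditions with AND.
conditions = ["BillingState = 'CA'", "AnnualRevenue > 1000000"]

where_clause = " AND ".join(conditions)
print(f"SELECT Id, Name FROM Account WHERE {where_clause}")
# SELECT Id, Name FROM Account WHERE BillingState = 'CA' AND AnnualRevenue > 1000000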

Pushdown Optimization

The Data Integration Service can push Filter transformation logic to Salesforce sources. The amount of Filter transformation logic that the Data Integration Service can push to the source depends on the location of the Filter transformation in the mapping, the source type, and the Filter transformation logic.

The Data Integration Service translates the transformation expression into a query by determining equivalent operators and functions in the application. If there is no equivalent operator or function, the Data Integration Service processes the transformation logic.

Pushdown Optimization Expressions

The Data Integration Service can push Filter transformation logic to Salesforce sources for expressions that contain a column name, an operator, and a literal string. When the Data Integration Service pushes transformation logic to Salesforce, the Data Integration Service converts the literal string in the expressions to a Salesforce data type. Filter transformation expressions can include multiple conditions that are separated by AND or OR.

You can use simple expressions with pushdown optimization. The following are examples of the syntax of simple expressions in lex format:

• <ComparisonOperator> ::= >|=|<|!=|<=|>= …

• <SINGLE_EXPR> ::= <Port> <ComparisonOperator> <Value>

• <AND_EXPR> ::= <SINGLE_EXPR> AND { <SINGLE_EXPR>| <AND_EXPR>}

• <OR_EXPR> ::= <SINGLE_EXPR> OR {<SINGLE_EXPR>| <OR_EXPR>}

• <SIMPLE_EXPRS> ::= <AND_EXPR> | <OR_EXPR>

For example, the query Name="Peter" AND Age>30 retrieves all records with the name Peter and an age above 30.

Note: Expressions that contain functions and parentheses are not supported.
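The following sketch is a rough, illustrative check of whether a filter expression fits the simple-expression shape described above: single comparisons of a port, an operator, and a literal, joined by AND or OR, with no functions or parentheses. It is not the parser that the Data Integration Service uses:

import re

# Hypothetical sketch: does an expression match the simple-expression grammar?
_SINGLE = re.compile(r'^\s*\w+\s*(<=|>=|!=|=|<|>)\s*(\'[^\']*\'|"[^"]*"|\d+(\.\d+)?)\s*$')

def is_simple_expression(expression: str) -> bool:
    if "(" in expression or ")" in expression:
        return False                      # functions and parentheses are not supported
    parts = re.split(r"\s+(?:AND|OR)\s+", expression, flags=re.IGNORECASE)
    return all(_SINGLE.match(part) for part in parts)

print(is_simple_expression('Name="Peter" AND Age>30'))    # True
print(is_simple_expression('UPPER(Name)="PETER"'))        # False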

Pushdown Optimization Rules and Guidelines

Consider the following rules and guidelines when you use pushdown optimization:

• When you use a filter expression and a Filter transformation, the Data Integration Service applies both filters.

• When you use both the SOQL filter condition and a Filter transformation, the Data Integration Service applies only the Filter transformation, and ignores the SOQL filter condition.

• When you use the Informatica filter condition, a Filter transformation, and the SOQL filter condition, the Data Integration Service applies only the Informatica filter condition and the Filter transformation. The SOQL filter condition is ignored.

• You cannot push transformation logic that contains Date/Time fields.

Capturing Deleted and Archived Salesforce Records

The Data Integration Service can capture active, deleted, and archived records from a Salesforce object. By default, mappings do not capture deleted and archived records.

To capture deleted and archived records, configure the Use queryAll data object read operation property.

Capturing Changed Data

The Data Integration Service can capture changed data from a Salesforce object that is replicateable and contains the CreatedDate and SystemModstamp fields.

If you configure a data object operation to capture changed data from a Salesforce object that is not replicateable or does not contain the CreatedDate and LastModifiedDate fields, the mapping fails. For more information about replicateable objects, see the Salesforce documentation.

You can capture changed data for a specific time period. Configure a mapping to capture changed data during a particular time period when the data changes.

By default, change data capture is disabled. To enable a particular CDC method, specify the required attributes in the data object operation read properties. Configure the attributes for one CDC method. If you configure properties for both CDC methods, the Data Integration Service captures changed data continuously.

If the record is inserted and updated during the same CDC interval, PowerExchange for Salesforce reads two records with two row types, such as insert and update. Use the row type to determine update strategy.

By default, the SystemModstamp is the time stamp that determines when a Salesforce record was last modified.
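As an illustration only, the following sketch shows the kind of SystemModstamp time window that change data capture implies. The actual query that PowerExchange for Salesforce issues is not shown in this guide, and the object and field list are assumptions:

from datetime import datetime, timedelta, timezone

# Hypothetical sketch: select records modified inside one flush interval.
end = datetime(2015, 6, 1, 9, 5, 0, tzinfo=timezone.utc)
start = end - timedelta(minutes=5)             # one five-minute flush interval

fmt = "%Y-%m-%dT%H:%M:%S.000Z"                 # SOQL datetime literals are unquoted
soql = (
    "SELECT Id, Name, SystemModstamp FROM Account "
    f"WHERE SystemModstamp >= {start.strftime(fmt)} "
    f"AND SystemModstamp < {end.strftime(fmt)}"
)
print(soql)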

Continuous CDC Mapping

When the Data Integration Service runs a mapping configured for continuous CDC, it reads records that were created, modified, or deleted in the specified time and passes them to the next transformation as rows flagged for insert, update, or delete. The Data Integration Service starts reading records from the Salesforce server time or from the CDC start time specified in the data object read operation.

The Data Integration Service completes the following tasks to capture changed data for a continuous CDC session:

• Reads all records created and passes them to the next transformation as rows flagged for insert.

• Reads all records updated and passes them to the next transformation as rows flagged for update.

• Reads all records deleted and passes them to the next transformation as rows flagged for delete.

After the Data Integration Service finishes reading all changed data, the flush interval starts again. The Data Integration Service stops reading from Salesforce when the CDC time limit ends.

For example, you set the CDC time limit to 60 minutes and the flush interval to five minutes. After the Data Integration Service reads the data, the flush interval begins. The Data Integration Service captures changed data after each five minute flush interval. The Data Integration Service stops reading from Salesforce after 60 minutes.

CDC Flush Interval Offset

The CDC flush interval offset is the number of seconds that you want to offset the CDC flush interval.

Configure the flush interval offset to capture real-time data that is submitted within the CDC time limit but not committed by Salesforce within the time limit. A delay might occur when Salesforce needs to process automatic triggers before it commits the data.

When you configure a flush interval offset, the Data Integration Service subtracts the flush interval offset from the flush interval.

For example, you set the flush interval to 300 seconds, and you set the flush interval offset to 2 seconds. The first flush interval starts at 9:00:00 and ends at 9:04:58. Without the flush interval offset, it would have ended at 9:05:00. The second flush interval starts at 9:04:59 and ends at 9:09:57. The third flush interval starts at 9:09:58 and ends at 9:14:56.
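The arithmetic in this example can be reproduced with a short sketch. This is only an illustration of the documented example (300-second flush interval, 2-second offset), not the scheduler that the service uses:

from datetime import datetime, timedelta

# Hypothetical sketch: compute the first three flush windows from the example.
interval = timedelta(seconds=300)
offset = timedelta(seconds=2)

start = datetime(2015, 6, 1, 9, 0, 0)
for _ in range(3):
    end = start + interval - offset            # each window is shortened by the offset
    print(f"{start:%H:%M:%S} - {end:%H:%M:%S}")
    start = end + timedelta(seconds=1)         # the next window starts one second later
# 09:00:00 - 09:04:58
# 09:04:59 - 09:09:57
# 09:09:58 - 09:14:56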

Configure a Continuous CDC

Complete the following tasks to capture changed data continuously for mappings that read from replicateable Salesforce objects:

Set the following properties in the data object read operation for continuous change data capture.

• CDC Time Limit and CDC Flush Interval, or



• CDC Start Timestamp and CDC Time Limit with CDC Flush Interval

Note: The Data Integration Service reads records from the specified CDC Start Timestamp until the Salesforce server time. After the records are read, continuous CDC starts based on the specified CDC Time Limit and CDC Flush Interval.

Time-Period Based CDC Mapping

When the Data Integration Service runs a CDC mapping for a specific time period, it reads all records in the data object and extracts the records that meet the CDC time period criteria.

The Data Integration Service completes the following steps to capture changed data for a time-period based CDC session:

• Reads all records created between the CDC start time and end time, and passes them to the next transformation as rows flagged for insert.

• Reads all records updated between the CDC start time and end time, and passes them to the next transformation as rows flagged for update.

• Reads all records deleted between the CDC start time and end time, and passes them to the next transformation as rows flagged for delete.

Configuring a Time-Period Based CDC

To enable change data capture for a specific time period, define the CDC Start Timestamp and CDC End Timestamp for the time period in the data object operation read properties.

Rules and Guidelines for Processing a Time-Period Based CDC Mapping

Consider the following rules and guidelines when you run a mapping with CDC for a particular time period:

• The Data Integration Service validates the formats of the start and end times when you run the mapping. If either timestamp format is wrong, the mapping fails.

• The values for the start and end times must be in the past.

• The start time must predate the end time.

• You cannot run the mapping continuously.

Enable Bulk Query

The Data Integration Service can read data from Salesforce sources using the Salesforce Bulk API. Use the Bulk API to read large amounts of data from Salesforce while generating a minimal number of API calls.

With the Bulk API, each batch of data can contain up to approximately 1 GB of data in CSV format. When the Data Integration Service creates a batch, it adds any required characters to properly format the data, such as adding quotation marks around text.

You can also monitor the progress of batches in the log file.

To configure a mapping to use the Salesforce Bulk API, select the Enable Bulk Query data object operation read property.



Note: Bulk read mode ignores the queryAll option.

Use SFDC Bulk API

The Data Integration Service can use the Salesforce Bulk API to write data to Salesforce objects. Use the Bulk API to write large amounts of data to Salesforce with a minimal number of API calls. You can use the Bulk API to write data to Salesforce targets with Salesforce API version 30.0 or later.

With a Bulk API write, each batch of data can contain up to 10,000 records or one million characters of data in CSV format. When the Data Integration Service creates a batch, it adds required characters, such as quotation marks around text, to format the data.
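As an illustration only, the following sketch shows CSV formatting of the kind described above, with quotation marks added around text values. The field names and rows are assumptions, and the exact CSV that the Data Integration Service produces may differ:

import csv
import io

# Hypothetical sketch: format two records as quoted CSV for a bulk batch.
rows = [
    {"Name": "Acme, Inc.", "BillingState": "CA"},
    {"Name": "Smith & Sons", "BillingState": "NY"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["Name", "BillingState"], quoting=csv.QUOTE_ALL)
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
# "Name","BillingState"
# "Acme, Inc.","CA"
# "Smith & Sons","NY"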

You can configure a Bulk API target session to load batches serially or in parallel. By default, the data load is in parallel mode, but you can override the data load to serial mode. You can also monitor the progress of batches in the session log.

To configure a session to use the Bulk API for Salesforce targets, select the Use SFDC Bulk API session property. When you select this property, the Data Integration Service ignores the Max Batch Size session property.

Configuring the Upsert Target Operation

The Salesforce upsert operation creates a new record or updates an existing record in a Salesforce object. You must provide one of the following types of fields to upsert records to a Salesforce object:

External ID field

You can use a custom Salesforce field to uniquely identify each record in a Salesforce object. You can create a custom external ID field for each object in Salesforce. You can view the properties of a Salesforce object to check whether the object includes an external ID field.

idLookup field

You can use a Salesforce idLookup field to identify each record in a Salesforce object. Salesforce creates idLookup fields for each standard Salesforce object. For example, the Email field is an idLookup field for the Contact object. Custom Salesforce objects do not contain an idLookup field. For more information about idLookup fields, see the Salesforce documentation.

A Salesforce target object might have multiple external ID or idLookup fields. By default, the Data Integration Service uses the first external ID or idLookup field it encounters. However, you can specify the external ID or idLookup field to use for the upsert operation in the run-time properties.

To configure the upsert operation to write to a Salesforce object, perform the following steps:

1. Map the external ID or idLookup field from the source to the target in the mapping. If you are using an external ID, map the external ID to the external ID field in the Salesforce object. If you are using an idLookup field, map the field to the appropriate target field. For example, map the email source field to the Email field in the Salesforce Contact object.

2. Configure the Treat Insert as Upsert or Treat Update as Upsert run-time property to upsert records.



3. To use the idLookup field instead of an external ID field, enable the Use idLookup Field for Upserts run-time property. By default, the Data Integration Service uses an external ID for upserts. You can configure the run-time property to override the external ID and use the idLookup, instead.

4. To specify an external ID or idLookup field, enter the external ID or idLookup field name in the Use this ExternalId/idLookup Field for Upserts run-time property.

Note: If you do not enter the name of the external ID or idLookup field, the Data Integration Service selects the first external ID or idLookup field it encounters. If you specify a field that is not an external ID or idLookup field, or if you misspell the field name, the run-time property fails.
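As an illustration only, the following sketch mirrors the selection logic described above: an explicitly named field wins, otherwise the first external ID or idLookup field is used, and the session fails if none is available. The field metadata is an assumption, not a call to a real Salesforce API:

# Hypothetical sketch of how an upsert key could be chosen.
def choose_upsert_key(fields, use_idlookup=False, explicit_field=None):
    if explicit_field:
        return explicit_field                   # Use this ExternalId/IdLookup field
    wanted = "idLookup" if use_idlookup else "externalId"
    for name, properties in fields:
        if properties.get(wanted):              # first matching field wins
            return name
    raise ValueError("no usable field: the session fails")

fields = [("Email", {"idLookup": True}), ("Account_Code__c", {"externalId": True})]
print(choose_upsert_key(fields))                       # Account_Code__c
print(choose_upsert_key(fields, use_idlookup=True))    # Email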

Configuring the Maximum Batch Size

The Data Integration Service writes data to a Salesforce target as a batch. The Max Batch Size attribute in the session properties determines the maximum number of records the Data Integration Service can write to a Salesforce target in a batch. The Salesforce service can receive a maximum of 200 records in a single insert, update, or delete operation.

To minimize the number of calls made to the Salesforce service, each batch must accommodate the maximum number of records as configured in the Max Batch Size property.
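As an illustration only, the following sketch shows how the batch size determines the number of calls made to the Salesforce service with the standard API. The row count is an assumption:

import math

# Hypothetical sketch: number of standard API calls for a given row count.
rows_to_write = 10_500                 # assumed number of source rows
max_batch_size = 200                   # Salesforce accepts at most 200 records per call

calls = math.ceil(rows_to_write / max_batch_size)
print(f"{calls} insert, update, or delete calls")   # 53 calls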

Handling Null Values in Update and Upsert Operations

By default, the Data Integration Service does not replace existing values in a Salesforce record with null values from the source during an update or upsert operation. To replace existing values with null values, configure the Set Fields to NULL session property for the Salesforce target.

You cannot set the value of an external ID field in a Salesforce target to NULL. The session fails if you enable the Set Fields to NULL session property and the session tries to replace the value in an external ID field with a null value.

Override an External ID with an idLookup for Upserts

The Data Integration Service can use the external ID or idLookup fields when performing an upsert operation to identify records in a Salesforce target. By default, the Data Integration Service uses the external ID field for upserts. You can configure the session to override the external ID field and use the idLookup field, instead.



Appendix A

Data Type Reference

This appendix includes the following topics:

• Data Type Reference Overview, 39

• Salesforce Data Types and Transformation Data Types, 39

Data Type Reference Overview

The Developer tool uses the following data types in PowerExchange for Salesforce mappings.

Salesforce native data types

Salesforce native data types appear in the physical data object column properties.

Transformation data types

Set of data types that appear in the transformations. They are internal data types based on ANSI SQL-92 generic data types, which the Data Integration Service uses to move data across platforms. Transformation data types appear in all transformations in a mapping.

When the Data Integration Service reads source data, it converts the native data types to the comparable transformation data types before transforming the data. When the Data Integration Service writes to a target, it converts the transformation data types to the comparable native data types.

Salesforce Data Types and Transformation Data Types

The following table lists the Salesforce data types that the Data Integration Service supports and the corresponding transformation data types:

Salesforce Data Type - Range and Description - Transformation Data Type
AnyType - Polymorphic data type that returns string, picklist, reference, boolean, currency, integer, double, percent, ID, date, datetime, URL, or email data - String
Base64 - Base64-encoded binary data - String
Boolean - Boolean (true/false) values - Integer
Byte - A set of bits - String
Combobox - Enumerated values - String
Currency - Currency values - Decimal
DataCategoryGroupReference - Types of category groups and unique category names - String
Date - Date values - Date/Time
DateTime - Date and time values - Date/Time
Double - Double values - Decimal
Email - Email addresses - String
Encrypted String - Encrypted text fields that contain any combination of letters, numbers, or symbols stored in encrypted form - String
ID - Primary key field for a Salesforce object - String
Int - Numbers with no fraction portion - Integer
Masterrecord - Master record ID of the merged record - String
Multipicklist - Multiple-selection picklists, which provide a set of enumerated values from which you can select multiple values - String
Percent - Percentage values - Decimal
Phone - Phone numbers - String
Picklist - Single-selection picklists, which provide a set of enumerated values from which you can select one value - String
Reference - Cross-references to another Salesforce object - String
String - Character strings - String
Textarea - String that appears as a multiple-line text field - String
Time - Time values - Date/Time
URL - URL values - String



Index

A
advanced properties
  input 26

B
batch size
  description for Salesforce 38
bulk API 36
Bulk API target session
  configuring for Salesforce 37

C
capture deleted and archived records 34
capturing changed data 34
CDC 34
CDC Flush Interval Offset
  description for Salesforce 35
column properties 23
configure
  time-period based CDC 36
configuring for Salesforce
  Bulk API target sessions 37
configuring HTTP proxy options
  Developer tool 14
creating
  Salesforce connection 19
  Salesforce data object read operation 29
custom Salesforce objects 20

D
data object read operation
  creating 29
datatype reference overview 39
DTM Buffer Size
  configuring for Salesforce 38

E
external ID
  description 37
  overriding with idLookup 38

G
general properties
  input 26

I
idLookup
  description for Salesforce 37
  overriding the external ID 38
importing Salesforce data object 28
input properties 25
installation
  overview 11
installing
  client component 14
  server component 12
  server component on UNIX 13
  server component on Windows 12

M
mapping output 30, 31

N
nulls
  handling in upserts and updates 38

O
overview
  Salesforce data object 20

P
performance
  configuring buffer block size for Salesforce 38
PowerExchange for Salesforce
  overview 9
process
  continuous CDC 35
properties
  Salesforce data object 21
  Salesforce data object read operation 22

R
related objects 21
rules and guidelines
  related objects 21
  time-period based CDC 36
run-time processing overview 32

S
Salesforce
  configure continuous CDC 35
  continuous CDC 35
  prerequisites 11
Salesforce mapping
  example 30, 31
  mapping input 30, 31
  mapping overview 30
Salesforce connection
  creating 19
  overview 16
  properties 16
Salesforce data object
  importing 28
Salesforce data object read operation
  creating 29
  properties 22
session conditions
  DTM Buffer Size for Salesforce 38
Set Fields to NULL session property 38
source properties 22
standard Salesforce objects 20

T
time-period based CDC 36

U
upsert
  configuring for Salesforce 37
  description for Salesforce 37
  external ID 37
  overriding external ID with idLookup 38
  Salesforce idLookup field 37
  Salesforce session configuration 37
use query all 34