10 Salesforce Integration design considerations from an Architect's point of view – Mind Map included

After working on multiple Salesforce implementation projects as an Architect, it's time to share what I have learned from those implementations. I would strongly suggest considering the points below before designing any "Salesforce Integration".

The image below shows the integration mind map I use to cover the major aspects when discussing integration approaches with enterprise architects. It is intentionally high level; if you think more points should be considered, or have other thoughts on the same, please share.

Salesforce Integration mind map diagram

1. Data Compliance

Let's start with the compliance part first. It is possible that the client has confidential data and doesn't want to move it to the cloud. In that case, we can expose the client's existing web application inside Salesforce using products like "Canvas" or "Lightning Connect". For Lightning Connect, the data needs to be available in the DMZ layer; for Canvas, it can stay on the intranet, so end users can access the data only within the office network, combining the flexibility of the cloud with the security of the intranet. A sketch of the Canvas handshake follows below.
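If the Canvas route is chosen, the embedded application has to verify the signed request Salesforce POSTs to it before trusting any context data: an HMAC-SHA256 signature and a Base64 JSON envelope separated by a period, verified with the connected app's consumer secret. A minimal sketch in Java; the class name and secret value are placeholders, not from the original post:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CanvasRequestVerifier {

    // Consumer secret of the connected app -- placeholder value.
    private static final String CONSUMER_SECRET = "<your-consumer-secret>";

    /** Returns the decoded JSON envelope if the signature is valid. */
    public static String verify(String signedRequest) throws Exception {
        // The signed request has the form "<signature>.<envelope>".
        String[] parts = signedRequest.split("[.]", 2);
        String encodedSignature = parts[0];
        String encodedEnvelope = parts[1];

        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(
                CONSUMER_SECRET.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        String expected = Base64.getEncoder().encodeToString(
                mac.doFinal(encodedEnvelope.getBytes(StandardCharsets.UTF_8)));

        if (!expected.equals(encodedSignature)) {
            throw new SecurityException("Signed request verification failed");
        }
        return new String(Base64.getDecoder().decode(encodedEnvelope),
                StandardCharsets.UTF_8);
    }
}
```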

2. ETL tool

As a best practice for integration design, extraction, transformation, and loading of data should be avoided within Salesforce. Salesforce has a multi-tenant architecture and therefore enforces many governor limits. I have seen many projects fail because of the decision not to use an ETL tool. There may be many reasons why ETL was not considered: perhaps the architect didn't foresee its future necessity, or the client was short on budget. An ETL tool does add to the budget, but it opens up endless possibilities.

If ETL is considered, then some very important questions need to be answered and designed for.

a. What happens if the ETL or any system goes down?

The software industry is full of surprises and malfunctions; no one can be completely sure about hardware or software. A business continuity plan is very important, and the question "What happens if the ETL goes down?" must be raised in design meetings. The simple answer is to declare clear "accountability" between systems. Any application can go down or hit an error, whether it is the source system, the middleware, or Salesforce, so defining clear accountability per system is essential. Who is responsible for handling the situation if the ETL goes down? Who is responsible if Salesforce goes down, doesn't respond in a timely manner, or throws an Apex or server error? We should have a proper error handling and logging mechanism between systems, plus backup plans. For example, if the ETL goes down, hold all requests in a messaging queue server between the source system and the ETL; once the ETL comes back online, start processing the queue.
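A minimal sketch of that buffering idea in Java/JMS, assuming an ActiveMQ broker at tcp://localhost:61616 and a queue named etl.requests (broker URL, queue name, and payload are illustrative; any durable messaging product works the same way):

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.DeliveryMode;
import javax.jms.MessageProducer;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class EtlRequestBuffer {

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session =
                    connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer =
                    session.createProducer(session.createQueue("etl.requests"));

            // PERSISTENT delivery survives a broker restart, so no request
            // is lost while the ETL is down; the ETL drains this queue
            // when it comes back online.
            producer.setDeliveryMode(DeliveryMode.PERSISTENT);
            producer.send(session.createTextMessage(
                    "{\"sourceId\":\"A-1001\",\"op\":\"upsert\"}"));
        } finally {
            connection.close();
        }
    }
}
```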

b. Throttling capabilities

When two or more systems communicate, they will not necessarily perform equally well; Salesforce could be faster or slower on some unlucky days, and the design should account for that. Most ETL tools can throttle, speeding requests up or slowing them down based on errors or network speed. Personally, I think this is very important in high-traffic integrations. In one of my projects, we had to load (upsert) millions of records into Salesforce on a weekly basis. Because of heavy Apex trigger logic and multiple field-update workflows on the same object, Salesforce performance was much slower than usual. There were no throttling capabilities, so a lot of data was lost to Salesforce errors such as "Concurrent Apex limit" and "Unable to lock rows". A retry sketch follows below.
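A minimal sketch of what such throttling could look like on the middleware side in Java: retry with exponential backoff and jitter when Salesforce reports a transient error. The sendBatch call and the error strings checked are placeholders for whichever API client is actually used:

```java
import java.util.concurrent.ThreadLocalRandom;

public class ThrottledUpsert {

    /** Retries a batch with exponential backoff on transient Salesforce errors. */
    static void upsertWithBackoff(String batch) throws InterruptedException {
        long delayMs = 1_000;                      // initial back-off
        for (int attempt = 1; attempt <= 5; attempt++) {
            try {
                sendBatch(batch);                  // placeholder API call
                return;                            // success
            } catch (RuntimeException e) {
                if (!isTransient(e) || attempt == 5) {
                    throw e;                       // give up, report as fallout
                }
                // Jitter keeps parallel workers from retrying in lockstep.
                Thread.sleep(delayMs + ThreadLocalRandom.current().nextLong(500));
                delayMs *= 2;                      // slow down after each failure
            }
        }
    }

    static boolean isTransient(RuntimeException e) {
        String m = String.valueOf(e.getMessage());
        return m.contains("UNABLE_TO_LOCK_ROW")
                || m.contains("ConcurrentPerOrgApex");
    }

    static void sendBatch(String batch) {
        /* real Bulk/SOAP/REST call goes here */
    }
}
```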

c. Priority of incoming and outgoing messages

Once "messaging" and "throttling" are in the design, the next obvious question is the priority of messages. While messages sit in a queue, some of them may need near-real-time processing because they are critical for the next system. We should have logic in the ETL or the messaging queue that decides priority based on some message criteria, as in the sketch below.
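At its core, such a dispatcher can be as simple as a priority queue in the middleware; the QueuedMessage shape and the priority values below are illustrative:

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDispatcher {

    /** Minimal message wrapper; the fields are illustrative. */
    record QueuedMessage(int priority, String payload) {}

    public static void main(String[] args) throws InterruptedException {
        // Lower number = higher priority; drained before bulk traffic.
        PriorityBlockingQueue<QueuedMessage> queue = new PriorityBlockingQueue<>(
                11, Comparator.comparingInt(QueuedMessage::priority));

        queue.put(new QueuedMessage(5, "weekly bulk load row"));
        queue.put(new QueuedMessage(1, "near-real-time order update"));

        // The consumer always receives the order update first.
        while (!queue.isEmpty()) {
            System.out.println(queue.take().payload());
        }
    }
}
```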

3. Fallout reporting capabilities

Whether we go with or without an ETL tool, fallout reporting capabilities are very important in an integration design. A fallout report may also be referred to as a "data loss" report: it captures the messages or requests that were not processed because of errors between systems. If we skip this step, validating data accuracy becomes very difficult, if not impossible. It can be treated as part of the error handling and logging design; a minimal sketch follows below.
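A minimal fallout report writer in Java; the Fallout fields and file name are illustrative, and in a real design the same data would likely also land in a database or a monitoring tool:

```java
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Instant;
import java.util.List;

public class FalloutReport {

    /** One failed message; the fields are illustrative. */
    record Fallout(String sourceId, String operation, String errorMessage) {}

    /** Writes failed messages to a CSV so data accuracy can be reconciled. */
    static void write(List<Fallout> failures, Path target) throws Exception {
        try (PrintWriter out = new PrintWriter(Files.newBufferedWriter(target))) {
            out.println("timestamp,source_id,operation,error");
            for (Fallout f : failures) {
                out.printf("%s,%s,%s,\"%s\"%n",
                        Instant.now(), f.sourceId(), f.operation(),
                        f.errorMessage().replace("\"", "'"));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        write(List.of(new Fallout("A-1001", "upsert", "UNABLE_TO_LOCK_ROW")),
              Path.of("fallout-report.csv"));
    }
}
```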

4. API limit considerations

Every external system that connects to Salesforce consumes API requests, which are limited per organization over a rolling 24-hour period; there is a separate Bulk API limit per 24 hours as well. Most of my Salesforce implementation projects had 3,000+ Salesforce users, so I have not personally hit the API limit, but it is good to account for it in the design. Consumption can even be checked at run time, as sketched below.
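The standard REST limits resource reports current consumption. A minimal sketch in Java, where the instance URL, API version, and access token are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ApiLimitCheck {

    public static void main(String[] args) throws Exception {
        // Instance URL, version, and token are placeholders; the
        // /limits REST resource itself is standard Salesforce.
        String instanceUrl = "https://yourInstance.my.salesforce.com";
        String accessToken = "<oauth-access-token>";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v52.0/limits"))
                .header("Authorization", "Bearer " + accessToken)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON body includes "DailyApiRequests": {"Max":…, "Remaining":…};
        // middleware can pause or throttle when Remaining runs low.
        System.out.println(response.body());
    }
}
```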

5. Single ETL tool to connect to every application

I know this point is very debatable: I am saying that every application connecting to Salesforce should go through a single ETL tool, and some of you may disagree. In one of my projects, I had to integrate Salesforce with around 18 systems. Boomi was the ETL tool of choice and was used to integrate around 8 of them; the other systems were integrated using web services, a custom Java tool built on the Bulk API, and the command line Data Loader. We faced a lot of issues maintaining and debugging errors across all these tools, and it would have been far easier with a single ETL tool for every system. Because multiple tools were used, a few of them were black boxes for us, changing data inside Salesforce at will. As discussed above, "accountability" was the main issue: whenever any tool failed for any reason, the Salesforce team got the blame, and we were on fire 😀 .

6. Different users for different systems

Let's say you are integrating with 10 systems using an ETL tool; it is always best practice to have a different integration user for each system. This brings a lot of benefits, flexibility, and control: each user can have its own profile, we keep complete control, and debugging becomes easy. Imagine a single "integration user" interacting with all systems: the chance of ownership data skew is very high, we cannot switch one system off from the Salesforce end without affecting the rest, and changing the profile becomes risky because it may break existing integrations.

7. Integration User license

As discussed above, it is good practice to have a different Salesforce user for each system, but not every one of them needs a full Salesforce user license. If the integration does not use standard objects, cost can be saved by opting for a "Force.com app bundle" license or another suitable Salesforce user license.

8. Data volume over time and data archiving considerations

Whether or not you are integrating with multiple systems, a data archiving strategy is very important to consider. Data archiving has many advantages, including better performance of reports, list views, and global search, and reduced Salesforce storage consumption. Archiving can be done using an ETL tool, or even the command line Data Loader connecting directly to a database such as SQL Server. While designing the archiving strategy, getting data back easily is just as important. For example, if the Data Loader backs data up into multiple CSV files, finding a record among hundreds of CSV files will be hard. Even if an RDBMS such as SQL Server or MySQL is used, indexing, partitioning, and other performance techniques must be considered. It should not be a one-way road where we can archive data but cannot get it back when needed. I have used "DBAmp" on many projects where ETL tools were not used and would recommend trying it out. A sketch of the archive insert step follows below.
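A minimal sketch of the archive step in Java/JDBC, assuming records were already extracted from Salesforce (for example via the Bulk API). The table, columns, and connection details are illustrative; the archive table should be indexed (here on sf_id) so archived records remain searchable:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class CaseArchiver {

    /** One extracted record; the fields are illustrative. */
    record ArchivedCase(String sfId, String subject, String closedDate) {}

    static void archive(List<ArchivedCase> cases) throws Exception {
        // JDBC URL and credentials are placeholders.
        try (Connection db = DriverManager.getConnection(
                "jdbc:sqlserver://archive-host;databaseName=sf_archive",
                "archive_user", "<password>");
             PreparedStatement insert = db.prepareStatement(
                "INSERT INTO case_archive (sf_id, subject, closed_date) VALUES (?,?,?)")) {

            for (ArchivedCase c : cases) {
                insert.setString(1, c.sfId());
                insert.setString(2, c.subject());
                insert.setString(3, c.closedDate());
                insert.addBatch();
            }
            insert.executeBatch();
            // Only after the insert succeeds should the source records be
            // deleted from Salesforce (e.g., via a Bulk API delete job).
        }
    }
}
```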

9. Single Sign On

This is also a very important topic that needs attention and thoughtful discussion. Salesforce provides many options for implementing Single Sign-On: federated authentication (SAML), delegated authentication, and various authentication providers. In addition to the APIs available for implementing SSO, Salesforce provides "Identity Connect", which can be considered for the implementation, and there are many third-party products as well, such as Ping Identity and Okta. Also, as a best practice, do not enable SSO for System Administrator users: if something happens to the identity provider, or its web services stop responding, end users will be blocked but System Administrators can still log into the system. If SSO is enabled for System Administrators too, the complete Salesforce instance becomes inaccessible to everyone.

10. Outbound messages are your best friend

The question is: are you using, or at least considering, outbound messages? I have seen many Salesforce implementations where the integration could easily have been done with outbound messages instead of Apex callouts. With outbound messages you don't need to write Apex code, test classes, or a polling mechanism for failed communication; Salesforce keeps retrying until the message is successfully delivered. There are still some design considerations when using outbound messages; a sketch of a receiving endpoint is below.
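For illustration, a minimal outbound message listener in plain Java: it accepts the SOAP notification and returns the Ack envelope that tells Salesforce the message was delivered. The port and path are arbitrary, and a production endpoint must be HTTPS with proper parsing and security checks:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class OutboundMessageListener {

    // Minimal SOAP ack; Salesforce retries until it receives Ack=true.
    private static final String ACK =
        "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
      + "<soapenv:Body>"
      + "<notificationsResponse xmlns=\"http://soap.sforce.com/2005/09/outbound\">"
      + "<Ack>true</Ack></notificationsResponse>"
      + "</soapenv:Body></soapenv:Envelope>";

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/outbound", exchange -> {
            // The request body is the SOAP notification from Salesforce;
            // parse the sObject fields here, then acknowledge.
            byte[] body = ACK.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "text/xml");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }
}
```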

Please feel free to leave comments and suggestions on this topic so that we can discuss each other's experiences with integration design and its problems.

Comments

  1. Gaurav Kheterpal

    Very well written, Jitendra. Where/how do you classify: A) batch vs. sync processing, B) native vs. custom reports vs. something as advanced as Wave Analytics, C) Salesforce1 readiness and cross-platform compatibility?

    1. Jitendra Zaa

      @gauravkheterpal, reporting is a good point to consider here (I will include it). However, sync vs. async depends more on business requirements, as does mobile readiness. In this post I tried to cover only the integration architecture between Salesforce and external systems.

  2. Radnip

    I've been brought into projects and seen exactly the same: companies don't want to pay for an ETL tool and the project goes bad… Then I point out that there are perfectly good free ETL tools out there, like Talend Open Studio, and the argument becomes that they don't like/agree with open source projects. OK, it may take a bit of time to learn, but really?! It's worth it! I TOTALLY agree on outbound messages; they are not used nearly as much as they should be. People just write Apex web service callouts when an outbound message could easily do the same job.

    1. Jitendra Zaa

      "Talend" is really a good tool. I have used it in one of my projects and was impressed by it. Coincidentally, that is my next targeted topic for blogging.

  3. Gautam Singh

    Great, detailed article, Jitendra! ETL indeed is a dark horse in data trails!

    I am digging deeper into ETL these days while designing the complete integration architecture for my client. They've been using Cast Iron orchestrations, which work well with staging objects. The challenge we face most of the time is "accountability", which never seems to get decided.

    Thanks for sharing your experience!

    1. Jitendra Zaa

      That's right, @singhgautam02; in integration design, "accountability" is sometimes more important than the actual technical challenges.

  4. Manish Mishra

    Hello Jitendra, I have a requirement, but I don't know whether it is possible or not.
    On a custom list view page (HTML5 or VF), can we preview attached doc/docx (resume) files on mouse hover or on click? Please suggest.

    1. Jitendra Zaa

      As far as I know, it's not possible natively and would be complex as well. You would need to rely on an AppExchange app or an external web service.

      1. Manish Mishra

        OK, thanks.

  5. Abhinav Gupta

    Very good article, Jitendra! I believe the next post should be about finding the right ETL tool based on price and other options. We are exploring Talend on our side, and it looks good so far, for FREE*.

    1. Jitendra Zaa

      Thanks, Abhinav. The next post is around Talend, and comparing ETL tools is a good idea 🙂

  6. Bennie

    Great one, Jitendra; eagerly awaiting your post on Talend.

  7. Shalin Siriwardhana

    Informative article! But I think a concept map is not necessarily a mind map: a mind map focuses on only one concept, while a concept map covers multiple concepts and ideas.

