I have read many posts and watched videos to understand microservices precisely; however, I found Martin Fowler's explanation of Microservices the most helpful. This blog post is just a recap and summary of what I understood about Microservices Architecture.
Characteristics of Microservices
Build services in the form of components
Components can be independently replaced and upgraded
Components can be a combination of libraries and services
Services can be built in different languages and can intercommunicate
Organized keeping business capabilities in mind
Traditionally (monolithic), services were organized along technical lines, with separate services related to UI, database, server, etc.
Microservices suggest grouping them by business capability, like Shipping, Order, Catalog, etc.
Smart endpoints and dumb pipes
In an ESB (aka spaghetti box 😉 lol), we tend to add all the smartness to the ESB itself, and the endpoint is just a dump where the consumer gets preprocessed data
Microservices, on the other hand, encourage dumb pipes (the ESB) and smart endpoints
Decentralized Governance and Data Management
Every service should be responsible for its own database and persistence
Services can't communicate with other services' databases directly; it should be via APIs only (this is largely inspired by Amazon's two-pizza team size)
Every service can use different languages or tools
Continuous Delivery is very important for each service, to make sure there is no or minimal downtime
Top-class monitoring capabilities are needed to analyze degraded performance or downtime
It is important to have a rollback plan and the ability to spin up a new server in case a service or server fails
Design for failure
As there could be many microservices, it is inevitable that some of them will fail.
Companies like Netflix have an application (Chaos Monkey) which randomly goes out and fails their microservices deliberately.
It is important to perform these kinds of exercises to understand how resilient the network and microservices are.
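The design-for-failure idea above can be sketched as a small retry-with-fallback wrapper: a caller keeps working even when a downstream microservice fails. The "catalog service" below is hypothetical and fails on purpose, just to show graceful degradation.

```javascript
// Minimal sketch: call a downstream microservice, retry on failure,
// and fall back to a default response instead of crashing the caller.
async function callWithFallback(serviceCall, fallback, retries = 2) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await serviceCall();
    } catch (err) {
      // Last attempt failed: degrade gracefully with the fallback value.
      if (attempt === retries) return fallback;
    }
  }
}

// Hypothetical flaky "catalog service" used only for this demo:
// it fails twice, then succeeds on the third attempt.
let calls = 0;
const flakyCatalog = async () => {
  calls++;
  if (calls < 3) throw new Error('service down');
  return { items: ['book', 'pen'] };
};

callWithFallback(flakyCatalog, { items: [] }).then((result) => {
  console.log(JSON.stringify(result));
});
```

Tools like Chaos Monkey exist precisely to verify that this kind of fallback path actually works in production, not just in theory.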
Improve Lightning Component performance using 15 simple rules like Storable Actions, avoiding server trips, Lightning Data Service, unidirectional data binding, creating component APIs, etc.
1. Avoid Server Trips
The most obvious way to improve Lightning Component performance is to avoid server trips. Let's say you need to know the queue Id to assign as the owner of a Case, and you also need custom setting information to drive the behavior of the Lightning Component. There are two ways to achieve this: call the Apex controller twice, or return the combined results from Apex in a single call and process the JSON in the client-side controller of the Lightning Component.
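A sketch of the single-call approach, assuming a hypothetical Apex method that returns both pieces of data as one JSON string (the field names and Id below are made up); the client-side controller then does one parse and uses each piece:

```javascript
// Hypothetical combined response from a single Apex call: one JSON
// string carrying both the queue Id and the custom setting values.
const serverResponse = JSON.stringify({
  queueId: '00G000000000001',
  settings: { showPanel: true, pageSize: 25 }
});

// Client-side controller work: one parse, then use each piece.
const payload = JSON.parse(serverResponse);
const caseOwnerId = payload.queueId;       // assign as the Case owner
const componentConfig = payload.settings;  // drive component behavior

console.log(caseOwnerId, componentConfig.pageSize);
```

One round trip instead of two means one less network latency hit, which is exactly what this rule is about.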
2. Use Storable Action
In this approach, the Lightning component shows a cached result instead of making an immediate server trip. The Lightning component makes the server (Apex) call in the background and, if the cached result is stale, it refreshes the cache and re-renders the component. This is very useful for devices with slow internet connections. If you are a Facebook or Google News user, you can easily relate to it: when we open these apps, they show the previous feed, and if there are new feeds, they give us the option to refresh the view or refresh it automatically. All you have to do is mark the action as storable in the client-side controller of the Lightning component using action.setStorable(). This blog post explains the working of storable actions in detail. Continue reading "15 ways to improve performance of Lightning Components in Salesforce"
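Under the hood, storable actions follow a stale-while-revalidate pattern. Here is a framework-free sketch of that pattern (the cache, fetcher, and render callback below are illustrative stand-ins, not the Aura API):

```javascript
// Framework-free sketch of the stale-while-revalidate idea behind
// storable actions: render cached data immediately, refresh in background.
const cache = new Map();

async function storableFetch(key, fetcher, render) {
  if (cache.has(key)) {
    render(cache.get(key), /* fromCache */ true); // instant paint from cache
  }
  const fresh = await fetcher();                  // background "server trip"
  if (JSON.stringify(fresh) !== JSON.stringify(cache.get(key))) {
    cache.set(key, fresh);
    render(fresh, false);                         // re-render only if stale
  }
  return fresh;
}

// Demo: second call paints the cached feed first, then the fresh one.
(async () => {
  await storableFetch('feed', async () => ['post-1'],
    (data, cached) => console.log(cached ? 'cached:' : 'fresh:', data));
  await storableFetch('feed', async () => ['post-1', 'post-2'],
    (data, cached) => console.log(cached ? 'cached:' : 'fresh:', data));
})();
```

In a real component, `action.setStorable()` gives you this behavior for free; the sketch only shows why the UI feels instant on slow connections.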
Short & quick note about Salesforce Identity product
Below is a summary of the important points about Identity Connect
It comes as an add-on feature for Salesforce, at additional cost
It only works with Active Directory
It is only a one-way sync, from Active Directory to Salesforce
We can assign a profile, role, and permission sets to a user using Identity Connect
Any changes made manually to mapped fields on the user record will be overwritten by the next sync.
Sync from Active Directory to Salesforce can be real-time or scheduled
If a user is deactivated in Active Directory, then the Salesforce user record also gets deactivated; Identity Connect internally uses the API to deactivate the user. With some other SSO solutions, a user deactivated in Active Directory cannot log in to Salesforce, but if they are already logged in on a mobile phone or a connected app, they can still access Salesforce. This problem is resolved by Identity Connect.
Identity Connect is installed on the client's network, behind the firewall, and pushes the changes to Salesforce from the client's network.
If we want to use Identity Connect for SSO and users want to use Salesforce from outside the company network or on a mobile phone, then its login page must be accessible on the internet. This can be done by installing Identity Connect in a De-Militarized Zone (DMZ).
Identity Connect is used for user provisioning, but not for Just-in-Time (JIT) provisioning.
We can use Identity Connect for SSO. If the customer already has SSO implemented, then Identity Connect can still be used for user provisioning alone.
One Identity Connect instance can be used for multiple Salesforce instances; however, they must be either all production or all sandboxes. If you want to use Identity Connect for production and sandbox at the same time, you need two Identity Connect instances: one for the sandbox and the other for production.
Identity Connect works with only one Active Directory, but that AD can have multiple domains.
Integrated Windows Authentication (IWA) is supported by Identity Connect using the Kerberos authentication protocol. This means that if a user is already logged into a company-provided Windows system, the login screen is bypassed and the Salesforce login experience is seamless.
Scheduled sync uses more API calls than real-time sync, because scheduled sync checks for changes across all Salesforce users versus all AD users.
If you are new to Microsoft Azure, you can get free trial access; however, you might need to provide credit card details to use a few features. You will not get charged, because new accounts get $200 worth of credit that can be used over the span of a year.
I was not able to use Azure Active Directory SSO for Just-in-Time (JIT) provisioning. Instead, it connects to Salesforce and creates the user whenever the user is provisioned in Active Directory, just like Identity Connect.
The security token is mandatory. If you have a login IP range, then you don't get a security token. To work around this, we can split the password so that some value goes in the security token field, since the final password is anyway Password + Security Token, as shown in the image below.
When we assign any user to the Enterprise application (in our case, Salesforce), we need to map a profile to the user.
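The password-splitting workaround can be illustrated with a toy example (the credential below is made up): since Salesforce checks the concatenation of the password and security token values, splitting the real password across the two fields still produces the correct final string.

```javascript
// Made-up credential: the real Salesforce password for this demo.
const fullPassword = 'Winter24!secret';

// Put the first part in the password field and the remainder in the
// security token field; the check is passwordField + tokenField.
const passwordField = fullPassword.slice(0, 8); // 'Winter24'
const tokenField = fullPassword.slice(8);       // '!secret'

console.log(passwordField + tokenField === fullPassword);
```

Any split point works, as long as the two pieces concatenate back to the full password.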
How to use Heroku Postgres Database from any third party application or local server
Heroku is a Platform as a Service (PaaS) provided by Salesforce and one of my favorite places to jump in and spin up any third-party application that can work seamlessly with Salesforce. Heroku also provides a free PostgreSQL database which can be used by your application.
Sometimes you may need to use this free, cloud-based PostgreSQL database offered by Heroku from a third-party application or your local server.
I was able to do it very quickly and easily, and I want to make sure I don't forget it in future, so here is the post. The complete source for this blog post can be found in my GitHub repository.
One important point to note here: the local server must support SSL, i.e. HTTPS. I have written posts in the past showing how SSL can be enabled in a Tomcat server or Node.js. This time I wanted to check how it can be done in Docker.
I always use Docker if I need to run Jenkins or any other server. Instead of maintaining all servers individually, it is easy and convenient to use a container platform like Docker and control servers or applications from there.
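As a sketch, a Heroku Postgres connection string (the `DATABASE_URL` value below is a fake example in Heroku's format) can be split into the settings a client library needs, with SSL switched on since Heroku Postgres requires SSL connections:

```javascript
// Fake example URL in Heroku's DATABASE_URL format:
// postgres://user:password@host:5432/dbname
const databaseUrl =
  'postgres://u4abc:s3cret@ec2-3-210-1-2.compute-1.amazonaws.com:5432/d5xyz';

const parsed = new URL(databaseUrl);
const config = {
  user: parsed.username,
  password: parsed.password,
  host: parsed.hostname,
  port: Number(parsed.port),
  database: parsed.pathname.slice(1), // drop the leading '/'
  ssl: { rejectUnauthorized: false }  // Heroku Postgres requires SSL
};

console.log(config.host, config.database);
```

A config object like this can then be handed to a Postgres client library (for example, node-postgres) from your third-party application or local server.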
Note: Everything we are discussing here is completely covered as part of the live coding in the Apex Salesforce Saturday organized by Amit. I would like to thank Mohith for his support and answers during the demo. It also shows how Salesforce DX can be used daily by developers during development.
Introduction to Jasmine framework and getting started with Behavior Driven Development (BDD) testing, along with Complete Source code and Video
Jasmine is mainly made up of three functions
Like standard frameworks, it also has setup and teardown methods
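A minimal spec using Jasmine's three core functions, `describe` (suite), `it` (spec), and `expect` (assertion), plus a `beforeEach` setup hook. The tiny shims at the top are stand-ins so this sketch runs with plain `node`; with Jasmine installed, those globals already exist and the shims can be deleted.

```javascript
// Stand-in shims so this file runs with plain `node` (delete with Jasmine).
const hooks = [];
function beforeEach(fn) { hooks.push(fn); }
function describe(name, fn) { fn(); }
function it(name, fn) { hooks.forEach((h) => h()); fn(); }
function expect(actual) {
  return {
    toBe(expected) {
      if (actual !== expected) throw new Error(`${actual} !== ${expected}`);
    }
  };
}

// The spec itself, in standard Jasmine BDD style.
describe('a simple calculator', () => {
  let total;
  beforeEach(() => { total = 0; }); // setup runs before every spec

  it('adds two numbers', () => {
    total = 2 + 3;
    expect(total).toBe(5);
  });

  it('starts from a clean state', () => {
    expect(total).toBe(0);          // beforeEach reset total
  });
});

console.log('all specs passed');
```

With Jasmine installed (e.g. `npm install jasmine`), the same `describe`/`it`/`expect` spec runs through its CLI and reporter, and `afterEach` is available as the matching teardown hook.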
Complete Source code and demo of implementing Custom Apex Adapter for Salesforce Connect
As you might already know, using Salesforce Connect we can display external data in Salesforce without physically creating records. Before Salesforce Connect, we had a few options like Visualforce, Canvas, etc. There are a few options available in Salesforce Connect to show data, like the OData 2.0 or OData 4.0 protocols, the cross-org adapter, the custom Apex adapter, etc.
There could be a scenario where you already have a license for Salesforce Connect and want to use it to expose external data inside Salesforce. If you think about a custom solution using Lightning components or Visualforce, there are many considerations, and the most important would be displaying the data in the user interface.
Using Salesforce Connect in the above scenario will cut down most of your effort. Your data will be exposed as an External Object, and you can use it just like a custom object. Just imagine how cool it is that you don't need to write a single line of code for data presentation.
Now here comes your challenge. It is quite possible that the external data source does not support the OData protocol, and you don't have the necessary middleware tool available to perform the transformation and expose it as OData. That doesn't mean you cannot use Salesforce Connect: one of its features is writing and using a custom adapter in Apex. Continue reading "Implementing Custom Apex Adapter for Salesforce Connect"
How to create a lookup field in Salesforce External Object
This is going to be a very short note for readers, and it is quite possible that you already know about the problem and its solution. However, let's discuss the problem first.
On an External Object, we do get an option to create a lookup field. It looks very straightforward at first: create a lookup field and populate a value in it. The image below shows what happens when you try to populate a newly created lookup field on an External Object.
Did you observe that the lookup field on the External Object does not retain any value in Salesforce? Salesforce should have given a proper error message, either while creating this field or after updating the record.
What is an External Object in Salesforce? It is a kind of virtual object which does not exist in your org. So you created a field on an object which does not exist and then tried to populate it; what else could you have expected?
The resolution of this situation is very easy and makes a lot of sense. The same solution applies to External Lookup fields in Salesforce as well. All you have to do is create a new field of type Text (length 18) in the source system first, and sync your External Data Source so that the new field appears in your destination org. Now edit the field's data type to Lookup or External Lookup, and it will work as expected.