Friday 17 April 2015

Writing Apex Scheduled Batch class - Framework!

Basics:
1. Apex schedulers are used to run data or logic processing at a given time.

2. Apex batch classes are implemented to process LDV (Large Data Volume) for given objects.

3. To implement a scheduled class, you need to implement the "Schedulable" interface.

4. To implement a batch class, you need to implement the "Database.Batchable" interface.

5. You can make a single class both Schedulable and Batchable, so you don't need two separate classes for batch and schedule. See the code below for this.

6. The default batch size is 200, and it can be changed when you invoke the Database.executeBatch method to run the batch class (the second parameter of this method is the batch size; the first parameter is an instance of the class). The maximum size is 2,000 and the minimum must be greater than zero.

7. Apex batch processing is an asynchronous mechanism, so there is no guarantee on when it will be processed.

8. When you schedule something at 7am, for example, it is not guaranteed to execute at exactly that time. Your job is actually pushed to a Salesforce queue at 7am, and Salesforce picks it up based on the resources available in its cloud (servers). So, in a nutshell, actual execution might be delayed depending on service availability.

9. You can only have 100 scheduled Apex jobs at one time. You can check your current count by viewing the Scheduled Jobs page in Salesforce and creating a custom view with a type filter equal to "Scheduled Apex". You can also programmatically query the CronTrigger and CronJobDetail objects to get the count of scheduled Apex jobs (see the query sketch after this list).

10. When writing a test class for your scheduled batch class, all assertions should come after Test.stopTest(), because this is asynchronous processing.

11. Governor limits for batch and scheduled Apex are applied separately. So, all scheduled Apex limits apply to batch jobs scheduled using System.scheduleBatch. After the batch job is queued (with a status of "Holding" or "Queued"), all batch job limits apply and the job no longer counts toward the scheduled Apex limits.
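
As a quick illustration of point 9, here is a small sketch of the programmatic check. The JobType value '7' is my assumption for "Scheduled Apex" based on the CronJobDetail field reference, so verify it in your org:

    // Count the Apex jobs currently scheduled in the org.
    // JobType = '7' is assumed to mean "Scheduled Apex" (see CronJobDetail docs).
    Integer scheduledApexCount = [
        SELECT COUNT()
        FROM CronTrigger
        WHERE CronJobDetail.JobType = '7'
    ];
    System.debug('Scheduled Apex jobs: ' + scheduledApexCount);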

Let's get to the code! Suppose you want to process Account records with a scheduled batch...

What should your code framework for AccountBatch look like?
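
Here is a minimal sketch of what such a class could look like. The query, the Description update, and the batch size of 100 are illustrative choices of mine, not a prescribed implementation:

    global class AccountBatch implements Database.Batchable<sObject>, Schedulable {

        // start(): define the scope of records to process
        global Database.QueryLocator start(Database.BatchableContext bc) {
            return Database.getQueryLocator('SELECT Id, Name, Description FROM Account');
        }

        // execute(): called once per chunk (default 200 records, see point 6)
        global void execute(Database.BatchableContext bc, List<Account> scope) {
            for (Account acc : scope) {
                // placeholder processing - put your real logic here
                acc.Description = 'Processed by AccountBatch on ' + System.today().format();
            }
            update scope;
        }

        // finish(): runs once after all chunks are committed
        global void finish(Database.BatchableContext bc) {
            // post-processing, notification emails, or chaining the next batch
        }

        // Schedulable entry point - the same class can be scheduled directly (point 5)
        global void execute(SchedulableContext sc) {
            Database.executeBatch(new AccountBatch(), 100); // batch size of 100
        }
    }

To schedule it for 7am every day, or to run it ad hoc:

    System.schedule('AccountBatch - 7 AM daily', '0 0 7 * * ?', new AccountBatch());
    Database.executeBatch(new AccountBatch(), 100);

And, per point 10, a test sketch with all assertions after Test.stopTest():

    @isTest
    private class AccountBatchTest {
        @isTest
        static void testBatchUpdatesAccounts() {
            insert new Account(Name = 'Test Account');

            Test.startTest();
            Database.executeBatch(new AccountBatch());
            Test.stopTest(); // forces the asynchronous job to complete

            // assertions only after Test.stopTest()
            Account acc = [SELECT Description FROM Account WHERE Name = 'Test Account'];
            System.assertNotEquals(null, acc.Description);
        }
    }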


Some important considerations:
1. Your batch size should ideally fall between 50 and 100, but there is no standard rule for this. The right batch size really depends on how many DML statements on different objects you have and how many callouts you are making from your method.
2. The start(), execute(), and finish() methods can each make up to 10 web service callouts. The timeout for each callout is 120 seconds. (Documentation link: https://www.salesforce.com/us/developer/docs/apexcode/Content/apex_batch_interface.htm)

Note: this limit is now 100, so you can make up to 100 callouts from a batch transaction.

Serial Batch processing:
This is a very common scenario where you need to execute batch classes on different objects one right after the other. How can you achieve that? It is very simple: call the next batch in the finish() method of your current batch, since the finish method is meant for doing work after your batch's changes have been committed.
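
For example, from AccountBatch's finish() method (ContactBatch here is just an assumed follow-up batch class):

    global void finish(Database.BatchableContext bc) {
        // kick off the next batch only after this batch's work is committed
        Database.executeBatch(new ContactBatch(), 100);
    }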

What is Database.Stateful?
This is an interface used to maintain state across transactions. When using Database.Stateful, only instance member variables retain their values between transactions; static member variables don't and are reset between transactions. If you don't specify Database.Stateful, all static and instance member variables are set back to their original values.
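
A small sketch: the recordsProcessed counter below keeps accumulating across chunks only because the class implements Database.Stateful:

    global class AccountStatefulBatch implements Database.Batchable<sObject>, Database.Stateful {

        // instance variable - retained between execute() transactions
        global Integer recordsProcessed = 0;

        global Database.QueryLocator start(Database.BatchableContext bc) {
            return Database.getQueryLocator('SELECT Id FROM Account');
        }

        global void execute(Database.BatchableContext bc, List<Account> scope) {
            recordsProcessed += scope.size(); // accumulates across all chunks
        }

        global void finish(Database.BatchableContext bc) {
            System.debug('Total records processed: ' + recordsProcessed);
        }
    }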

What is Apex Flex Queue?
It lets you submit up to 100 batch jobs without getting an error. The batch job is placed in the Apex flex queue with its status set to "Holding". This feature was released in Spring '15 and is a great improvement over the Apex batch queue.
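
You can see what is currently held in the flex queue with a simple query, for example:

    // Batch jobs waiting in the Apex flex queue have a status of 'Holding'
    List<AsyncApexJob> heldJobs = [
        SELECT Id, ApexClass.Name, Status, CreatedDate
        FROM AsyncApexJob
        WHERE JobType = 'BatchApex' AND Status = 'Holding'
        ORDER BY CreatedDate
    ];
    System.debug('Jobs waiting in the flex queue: ' + heldJobs.size());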

Writing Apex Triggers - One trigger per object - Framework!

Apex triggers are one of the most essential parts of the Force.com platform for back-end automation. Apex trigger code always executes before or after these events: insert, update, delete, undelete, merge (delete + update), and upsert (insert + update).

When to use before triggers?
When you have a requirement to update records on the same object on which the trigger is implemented. Before triggers are also used when there is any kind of custom validation to be done based on a set of fields (a compound key), or to show a custom message on a record (addError).

When to use after triggers?
When you have a requirement to create, update, or delete records on an object other than the one on which the trigger is implemented, e.g. create a Warranty__c record after an Opportunity is "Closed Won".
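
A tiny illustration of that example (the Warranty__c lookup field name is my assumption, and the logic is kept directly in the trigger only for brevity; the framework below shows the recommended structure):

    trigger OpportunityTrigger on Opportunity (after update) {
        List<Warranty__c> warranties = new List<Warranty__c>();
        for (Opportunity opp : Trigger.new) {
            Opportunity oldOpp = Trigger.oldMap.get(opp.Id);
            // create a warranty only when the stage has just changed to Closed Won
            if (opp.StageName == 'Closed Won' && oldOpp.StageName != 'Closed Won') {
                warranties.add(new Warranty__c(Opportunity__c = opp.Id));
            }
        }
        if (!warranties.isEmpty()) {
            insert warranties;
        }
    }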

A very important rule to keep in mind while implementing triggers is "Only ONE trigger per object".

3 ways to write Apex Triggers:

  1. Modular Programming Approach – Simplest
    • We use this 70% of the time
  2. Factory Pattern (20%)
  3. Abstraction of Handlers (10%)
    • When there are multiple handlers for an object
    • Sequencing of handlers is required in execution
    • Dynamic instance creation is required.
    • CustomSettings__c cs = CustomSettings__c.getInstance('Vehicle'); 
    • Type t = Type.forName(cs.className__c);
    • Vehicle v = (Vehicle) t.newInstance(); 
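
Putting those three lines together, a rough sketch of option 3 (Vehicle, CarHandler, and the run() method are illustrative names of mine):

    // Base type that every handler implements
    public abstract class Vehicle {
        public abstract void run(List<SObject> newRecords);
    }

    // One concrete handler; its name is stored in the custom setting
    public class CarHandler extends Vehicle {
        public override void run(List<SObject> newRecords) {
            // handler-specific logic here
        }
    }

    // Inside the trigger/dispatcher:
    CustomSettings__c cs = CustomSettings__c.getInstance('Vehicle');
    Type t = Type.forName(cs.className__c);   // e.g. 'CarHandler'
    Vehicle v = (Vehicle) t.newInstance();    // requires a no-argument constructor
    v.run(Trigger.new);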


The standard batch size for any bulkified trigger is 200 records, but if required you can split it in code into smaller batches (n < 200). So, let's get started with a robust, easy-to-control, and easy-to-maintain framework for a trigger on the Account object!

What will my AccountTrigger code look like?
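
Here is a minimal sketch of how it could look: the trigger stays as thin as possible and simply delegates every event to the handler.

    trigger AccountTrigger on Account (
        before insert, before update, before delete,
        after insert, after update, after delete, after undelete
    ) {
        // single entry point - all routing decisions live in the handler
        AccountTriggerHandler.execute();
    }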

What will my AccountTriggerHandler code look like?
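
A sketch of the handler: it checks the Trigger_Control__c kill switch (described in the notes below), guards against re-entry, and routes each event to an action method. The action method names are my own illustrative choices:

    public with sharing class AccountTriggerHandler {

        // guard used to stop the trigger re-entering itself when its own
        // actions perform DML on Account records
        private static Boolean isReentry = false;

        public static void execute() {
            // kill switch: hierarchical custom setting, org defaults only for now
            Trigger_Control__c tc = Trigger_Control__c.getOrgDefaults();
            if (tc != null && tc.Account_Trigger__c == false) {
                return; // trigger switched off, e.g. during historic data migration
            }

            if (Trigger.isBefore) {
                if (Trigger.isInsert) {
                    AccountTriggerActions.applyDefaults((List<Account>) Trigger.new);
                } else if (Trigger.isUpdate) {
                    AccountTriggerActions.validateChanges(
                        (List<Account>) Trigger.new, (Map<Id, Account>) Trigger.oldMap);
                }
            } else if (Trigger.isAfter) {
                if (Trigger.isUpdate && !isReentry) {
                    isReentry = true;
                    AccountTriggerActions.syncRelatedRecords(
                        (List<Account>) Trigger.new, (Map<Id, Account>) Trigger.oldMap);
                    isReentry = false;
                }
                // after insert / delete / undelete branches follow the same pattern
            }
        }
    }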

What will my AccountTriggerActions code look like?
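
The actions class holds the actual business logic, one bulkified method per piece of work. The defaulting rule, the revenue validation, and the follow-up Task are purely illustrative:

    public with sharing class AccountTriggerActions {

        // before insert: set default values directly on the records in Trigger.new
        public static void applyDefaults(List<Account> newAccounts) {
            for (Account acc : newAccounts) {
                if (String.isBlank(acc.Rating)) {
                    acc.Rating = 'Warm'; // illustrative default only
                }
            }
        }

        // before update: custom validation with addError
        public static void validateChanges(List<Account> newAccounts, Map<Id, Account> oldMap) {
            for (Account acc : newAccounts) {
                Account oldAcc = oldMap.get(acc.Id);
                if (acc.AnnualRevenue != null && oldAcc.AnnualRevenue != null
                        && acc.AnnualRevenue < oldAcc.AnnualRevenue) {
                    acc.addError('Annual Revenue cannot be decreased.'); // illustrative rule only
                }
            }
        }

        // after update: act on another object, bulkified into a single DML statement
        public static void syncRelatedRecords(List<Account> newAccounts, Map<Id, Account> oldMap) {
            List<Task> followUps = new List<Task>();
            for (Account acc : newAccounts) {
                if (acc.Rating == 'Hot' && oldMap.get(acc.Id).Rating != 'Hot') {
                    followUps.add(new Task(WhatId = acc.Id, Subject = 'Follow up with hot account'));
                }
            }
            if (!followUps.isEmpty()) {
                insert followUps;
            }
        }
    }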

What will my AccountTriggerTest code look like?
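
And a matching test sketch, keeping the assertions after Test.stopTest() (not strictly required for synchronous trigger actions, but consistent with the batch section above):

    @isTest
    private class AccountTriggerTest {

        @isTest
        static void testValidationBlocksRevenueDecrease() {
            Account acc = new Account(Name = 'Test Account', AnnualRevenue = 1000);
            insert acc;

            acc.AnnualRevenue = 500;
            Test.startTest();
            Database.SaveResult sr = Database.update(acc, false); // allOrNone = false so we can inspect the error
            Test.stopTest();

            System.assertEquals(false, sr.isSuccess(), 'Update should be blocked by the validation action');
        }

        @isTest
        static void testHotRatingCreatesFollowUpTask() {
            Account acc = new Account(Name = 'Test Account', Rating = 'Warm');
            insert acc;

            acc.Rating = 'Hot';
            Test.startTest();
            update acc;
            Test.stopTest();

            System.assertEquals(1, [SELECT COUNT() FROM Task WHERE WhatId = :acc.Id]);
        }
    }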

Some technical specifications about the above code:
1. Trigger_Control__c: This is a hierarchical custom setting used to control the execution of the Account trigger. At the moment I am only looking at the org defaults, but you can code it to look at a specific profile's or user's custom setting. This is very useful when you are working on an enterprise-scale project and want to migrate millions of records without keeping these triggers on (historic data migration). You can have a user like "API Connect" and configure Trigger_Control__c for that user by setting the trigger fields, such as Account_Trigger__c, to false. Once the migration is done, you can turn the field on again!

2. Naming conventions: The naming conventions I followed in the above code are the best I have found so far. They help you see all the components listed together in one shot, and the variable naming convention means you don't have to keep every variable in mind while coding.


Advantages of the above framework:
1. Precise control over the order of execution inside a single trigger
2. Separation of concerns
3. Control over re-entrant code
4. Clear organization and structure

What happens when an upsert or merge event occurs?
1. Upsert: Upsert triggers fire both before and after insert triggers or before and after update triggers, as appropriate.
2. Merge: Merge triggers fire both before and after delete triggers for the losing records, and before and after update triggers for the winning record only.