
How To Optimize the Salesforce CRM Analytics Dashboards Using SAQL

Explore Salesforce Analytics Query Language (SAQL), a query language for exploring, transforming, and analyzing CRM Analytics data.

By Kapil Kumar Sharma · May 10, 2024 · Tutorial

Salesforce Analytics Query Language (SAQL) is a proprietary Salesforce query language designed for analyzing Salesforce native objects and CRM Analytics datasets. SAQL enables developers to query, transform, and project data to surface business insights by customizing CRM Analytics dashboards. SAQL is similar to SQL (Structured Query Language); however, it is designed to explore data within Salesforce and has its own syntax, closer to Apache Pig's Pig Latin.

You can also use SAQL to implement complex logic while preparing datasets using dataflows and recipes. 

Key Features

Key features of SAQL include the following:

  • It lets users specify filter conditions, and group and summarize input data streams into aggregated values to derive actionable insights and analyze trends.
  • It supports conditional statements such as IF-THEN-ELSE and CASE, which can be used to express complex conditions for data filtering and transformation (see the sketch after this list).
  • Its DATE- and TIME-related functions make it much easier to work with date and time attributes, allowing users to perform time-based analysis, such as comparing data over various time intervals.
  • It supports a variety of data transformation functions to cleanse, format, and typecast data, altering the structure of the data to suit your requirements.
  • It enables you to create complex calculated fields from existing data fields by applying mathematical, logical, or string functions.
  • It integrates seamlessly with Salesforce objects and CRM Analytics datasets.
  • SAQL queries can be used to build visuals like charts, graphs, and dashboards within the Salesforce CRM Analytics platform.
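
As a quick illustration of the conditional and calculated-field features above, here is a minimal sketch. The "OpportunityLineItems" dataset name mirrors the example later in this article, but the 'Amount' and 'StageName' fields are assumptions for illustration only.

SQL
 
-- Minimal sketch of CASE logic and a calculated field.
-- Dataset and field names ('OpportunityLineItems', 'Amount', 'StageName') are assumed for illustration.
q = load "OpportunityLineItems";
q = foreach q generate 'StageName' as 'StageName',
    (case when 'Amount' >= 100000 then "Large Deal" else "Standard Deal" end) as 'DealSize',
    ('Amount' * 0.1) as 'EstimatedCommission';
q = limit q 10;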

The rest of this article explains the fundamentals of writing SAQL queries and delves into a few use cases where you can use SAQL to analyze Salesforce data.

Basics of SAQL

A typical SAQL query works like an ETL pipeline: it loads one or more datasets, performs operations and transformations on them, and produces an output data stream to be used in a visualization. SAQL statements can span multiple lines and are terminated with a semicolon. Each statement assigns its result to a named stream, which can serve as input for any subsequent statement in the same query.

The following SAQL query can be used to create a data stream to analyze the opportunities booked in the previous year by month.

SQL
 
1.	q = load "OpportunityLineItems";
2.	q = filter q by 'StageName' == "6 - Closed Won" and date('CloseDate_Year', 'CloseDate_Month', 'CloseDate_Day') in ["1 year ago".."1 year ago"];
3.	q = group q by ('CloseDate_Year', 'CloseDate_Month');
4.	q = foreach q generate q.'CloseDate_Year' as 'CloseDate_Year', q.'CloseDate_Month' as 'CloseDate_Month', sum(q.'ExpectedTotal__c') as 'Bookings';
5.	q = order q by ('CloseDate_Year' asc, 'CloseDate_Month' asc);
6.	q = limit q 2000;


Line-by-line description:

  • Line 1: Loads the CRM Analytics dataset named "OpportunityLineItems" into an input stream q.
  • Line 2: Filters the input stream q to the opportunities marked Closed Won in the previous year. This is similar to the WHERE clause in SQL.
  • Line 3: Groups the records by close date year and month so that we can visualize the data by month. This is similar to the GROUP BY clause in SQL.
  • Line 4: Projects the attributes we want from the input stream. Here the expected total is summed up for each group.
  • Line 5: Orders the records by close date year and month so that we can create a line chart to visualize the data by month.
  • Line 6: Restricts the stream to a limited number of rows. This is mainly used for debugging purposes.

Joining Multiple Data Streams

The SAQL cogroup statement joins input data streams, such as streams loaded from Salesforce objects or CRM Analytics datasets. The streams being joined must share a related column to join on. cogroup supports both inner and outer joins.

For example, if you had two datasets, one containing sales data and another containing customer data, you could use cogroup to join them on a common field like customer ID. The resulting data stream contains fields from both sources.
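
As a minimal sketch of that example (the "Sales" and "Customers" dataset names and their fields are assumptions, not datasets from this article), the join could look like this:

SQL
 
-- Hypothetical datasets "Sales" and "Customers" sharing a 'CustomerId' field.
s = load "Sales";
c = load "Customers";
r = cogroup s by 'CustomerId', c by 'CustomerId';
r = foreach r generate s.'CustomerId' as 'CustomerId',
    first(c.'CustomerName') as 'CustomerName',
    sum(s.'Amount') as 'TotalSales';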

Use Case

The following code block builds a data stream of NewPipeline and Bookings for each customer. The pipeline built and the bookings come from two different streams, which we join by account name.

SQL
 
q = load "Pipeline_Metric";
q = filter q by 'Source' in ["NewPipeline"];
q = group q by 'AccountName';
q = foreach q generate q.'AccountName' as 'AccountName', sum(ExpectedTotal__c) as 'NewPipeline';

q1 = load "Bookings_Metric";
q1 = filter q1 by 'Source' in ["Bookings"];
q1 = group q1 by 'AccountName';
q1 = foreach q1 generate q1.'AccountName' as 'AccountName', sum(q1.ExpectedTotal__c) as 'Bookings';

q2 = cogroup q by 'AccountName', q1 by 'AccountName';
result = foreach q2 generate q.'AccountName' as 'AccountName', sum(q.'NewPipeline') as 'NewPipeline',sum(q1.'Bookings') as 'Bookings';


You can also use a left outer cogroup to join the right data stream with the left. This returns all the records from the left data stream and the matching records from the right stream. Use the coalesce function to replace null values coming from the right stream with another value. Continuing the example above, if you want to report all accounts with or without bookings, you can use the query below.

SQL
 
q = load "Pipeline_Metric";
q = filter q by 'Source' in ["NewPipeline"];
q = group q by 'AccountName';
q = foreach q generate q.'AccountName' as 'AccountName', sum(ExpectedTotal__c) as 'NewPipeline';

q1 = load "Bookings_Metric";
q1 = filter q1 by 'Source' in ["Bookings"];
q1 = group q1 by 'AccountName';
q1 = foreach q1 generate q1.'AccountName' as 'AccountName', sum(q1.ExpectedTotal__c) as 'Bookings';

q2 = cogroup q by 'AccountName' left, q1 by 'AccountName';

result = foreach q2 generate q.'AccountName' as 'AccountName', sum(q.'NewPipeline') as 'NewPipeline', coalesce(sum(q1.'Bookings'), 0) as 'Bookings';


Top N Analysis Using Windowing

SAQL enables Top N analysis across groups of values using windowing functions within the input data stream. These functions are also used to derive moving averages, cumulative totals, and rankings within groups.

You specify the set of records over which to run these calculations using the over keyword. SAQL lets you specify an offset identifying the number of records before and after the current row; optionally, you can operate on all the records within a partition. This set of records is called a window.

Once the set of records for a window is identified, you can apply an aggregate function to all the records within it.

Optionally, you can create partitions to group the records by a set of fields and perform the aggregate calculations for each partition independently.
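
For instance, a cumulative (running) total could be sketched as below. The "Pipeline_Metric" dataset name is taken from the examples in this article, but the 'CloseDate_Year' field is an assumption.

SQL
 
-- Cumulative total sketch: running pipeline by year.
-- 'Pipeline_Metric' comes from the examples in this article; 'CloseDate_Year' is an assumed field.
q = load "Pipeline_Metric";
q = group q by 'CloseDate_Year';
q = foreach q generate q.'CloseDate_Year' as 'Year',
    sum('ExpectedTotal__c') as 'YearlyPipeline',
    sum(sum('ExpectedTotal__c')) over([.. 0] partition by all order by 'CloseDate_Year' asc) as 'RunningTotal';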

Use Case

The following SAQL code prepares data showing each customer's percentage contribution of new pipeline to the total pipeline in their region, along with each customer's rank within the region.

SQL
 
q = load "Pipeline_Metric";
q = filter q by 'Source' in ["NewPipeline"];
q = group q by ('Region','AccountName');
q = foreach q generate q.'Region' as 'Region',q.'AccountName' as 'AccountName',  
((sum('ExpectedTotal__c')/sum(sum('ExpectedTotal__c'))
over ([..] partition by 'Region')) * 100) as 'PCT_PipelineContribution', rank() over ([..] partition by ('Region') order by sum('ExpectedTotal__c') desc ) as
'Rank';
q = filter q by 'Rank' <=5;


Data Aggregation: Grand Totals and Subtotals With SAQL

SAQL offers the rollup and grouping functions to aggregate data streams over pre-defined groups. The rollup construct is used with the group by statement, while grouping is used in foreach statements when projecting the input data stream.

The rollup function aggregates the input data stream at the various levels of a hierarchy, letting you compute calculated fields on subtotals and totals at coarser levels of granularity. For example, if you have data by day, rollup can be used to aggregate the results by week, month, or year.

The grouping function tells you whether a row in the rolled-up stream is a subtotal or grand-total row for a given field, so you can label or filter those rows when projecting the data. For example, when rolling up sales data by product category or region, you can use grouping to label the subtotal row for each group.
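
A minimal sketch of rollup plus grouping follows; the "Sales" dataset and its 'Date_Year', 'Date_Month', and 'Amount' fields are assumptions, not data from this article.

SQL
 
-- Subtotals per year and a grand total; dataset and field names are assumed.
q = load "Sales";
q = group q by rollup('Date_Year', 'Date_Month');
q = foreach q generate
    case when grouping('Date_Year') == 1 then "GRAND TOTAL" else 'Date_Year' end as 'Year',
    case when grouping('Date_Month') == 1 then "YEAR TOTAL" else 'Date_Month' end as 'Month',
    sum('Amount') as 'TotalAmount';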

Use Case

Use the code below to prepare data for the total number of accounts and the number of engaged accounts by region and theater. Also add a grand total to look at the global numbers, plus subtotals for the regions and theaters.

SQL
 
q = load "ABXLeadandOpportunities_Metric";
q = filter q by 'Source' == "ABX Opportunities" and 'CampaignType' == "Growth Sprints" and 'Territory_Level_01__c' is not null;
q = foreach q generate 'Territory_Level_01__c' as 'Territory_Level_01__c','Territory_Level_02__c' as 'Territory_Level_02__c','Territory_Level_03__c' as 'Territory_Level_03__c', q.'AccountName' as 'AccountName',q.'OId' as 'OId','MarketingActionedOppty' as 'MarketingActionedOppty','AccountActionedAcct' as 'AccountActionedAcct','ADRActionedOppty' as 'ADRActionedOppty','AccountActionedADRAcct' as 'AccountActionedADRAcct';
q = group q by rollup ('Territory_Level_01__c', 'Territory_Level_02__c');
q = foreach q generate case when grouping('Territory_Level_01__c') == 1 then "TOTAL" else 'Territory_Level_01__c' end as 'Level1', 
case when grouping('Territory_Level_02__c') == 1 then "LEVEL1 TOTAL" else 'Territory_Level_02__c' end as 'Level2',
unique('AccountName') as 'Total Accounts',unique('AccountActionedAcct') as 'Engaged',((unique('AccountActionedAcct')  / unique('AccountName'))) as '% of Engaged'; 
q = limit q 2000;


Filling the Missing Date Fields

You can use the fill() function to create records for missing days, weeks, months, quarters, or years in your dataset. This comes in handy when you want to show a result of 0 for those missing periods instead of not showing them at all.

Use Case

The following SAQL code tracks the number of tasks logged by the sales agents for each day of a given week. If the agents are on PTO on a given day, you still want to show 0 tasks for that day.

SQL
 
q = load "Tasks_Metric";
q = filter q by 'Source' == "Tasks";
q = filter q by date('MetricDate_Year', 'MetricDate_Month', 'MetricDate_Day') in [dateRange([2024,4,23], [2024,4,30])];
q = group q by ('MetricDate_Year', 'MetricDate_Month', 'MetricDate_Day');
q = foreach q generate q.'MetricDate_Year' as 'MetricDate_Year', q.'MetricDate_Month' as 'MetricDate_Month', q.'MetricDate_Day' as 'MetricDate_Day', unique(q.'Id') as 'Tasks';
q = order q by ('MetricDate_Year' asc, 'MetricDate_Month' asc, 'MetricDate_Day' asc);
q = limit q 2000;


The code above will be missing two days where there were no tasks created. You can use the code below to fill in the missing days.

SQL
 
q = load "Tasks_Metric";
q = filter q by 'Source' == "Tasks";
q = filter q by date('MetricDate_Year', 'MetricDate_Month', 'MetricDate_Day') in [dateRange([2024,4,23], [2024,4,30])];
q = group q by ('MetricDate_Year', 'MetricDate_Month', 'MetricDate_Day');
q = foreach q generate q.'MetricDate_Year' as 'MetricDate_Year', q.'MetricDate_Month' as 'MetricDate_Month', q.'MetricDate_Day' as 'MetricDate_Day', unique(q.'Id') as 'Tasks';
-- fill() inserts rows for any year/month/day combinations missing from the stream.
q = fill q by (dateCols=(MetricDate_Year, MetricDate_Month, MetricDate_Day, "Y-M-D"));
q = order q by ('MetricDate_Year' asc, 'MetricDate_Month' asc, 'MetricDate_Day' asc);
q = limit q 2000;


You can also specify the start date and end date to populate the missing records between these dates.

Conclusion

SAQL has proven itself a powerful tool for the Salesforce developer community, enabling developers to extract actionable business insights from CRM datasets using capabilities like filtering, aggregation, windowing, time-based analysis, data blending, custom calculations, Salesforce integration, and performance optimization.

In this article, we explored several capabilities of the language through targeted use cases. As a next step, I recommend continuing to learn by exploring the Salesforce documentation, building your own data models using dataflows, and applying SAQL to harness the true potential of Salesforce as a CRM.

