Author: Stephanie Damgaard

  • Lane Four Named One of Canada’s Top Growing Companies for a Second Year

    We are pleased to announce that Lane Four has placed #222 on the 2020 Report on Business ranking of Canada’s Top Growing Companies. Placement on the list is determined by three-year revenue growth. Lane Four proudly earned its spot with three-year growth of 183%.

    Since ranking #195 on last year’s list, Lane Four has continued to build its impressive list of clients while also expanding its own consultant, developer, and go-to-market teams. Whether through its Salesforce consulting services or its lead-to-account matching and routing tools, Lane Four is known for delivering solutions that allow high-growth SaaS companies to optimize critical sales operations processes at scale. With the ongoing disruption of COVID-19, Lane Four is helping clients navigate an environment in which automation is even more critical for maintaining revenue growth.

    On the product side, Lane Four is growing rapidly within the emerging lead-to-account matching and routing software category. Recently, Lane Four was named as one of the top four vendors in the category by TOPO.

    Lane Four’s founder, Andrew Sinclair, is proud to count some of Silicon Valley’s most forward-thinking companies as clients and partners: companies like Carta, Lyra, and Outreach.io. “Despite the setbacks that many of our clients are experiencing right now, it’s an exciting time to be growing in the SaaS space. We love working with startups, and we’re excited about the milestones our clients work so hard to achieve. Whether it’s an IPO, a high profile acquisition, or a round of funding, to be a small part of helping our clients succeed at major growth is what keeps us motivated.”

    Since the beginning, Lane Four has also been committed to helping local non-profit organizations leverage Salesforce for fundraising and program management—including work with the Jane Goodall Institute, the Institute for Canadian Citizenship, and the Walrus Foundation. As the company grows, this commitment remains an important part of Lane Four’s spirit.

    Celebrating Entrepreneurial Achievement in Canada

    Launched in 2019, the Canada’s Top Growing Companies editorial ranking aims to celebrate entrepreneurial achievement by identifying and amplifying the success of growth-minded, independent businesses in Canada. It is a voluntary program; companies had to complete an in-depth application process in order to qualify. In total, 400 companies earned a spot on this year’s ranking.

    The full list of 2020 winners, and accompanying editorial coverage, is published in the October issue of Report on Business magazine and online.

    “Any business leader seeking inspiration should look no further than the 400 businesses on this year’s Report on Business ranking of Canada’s Top Growing Companies,” says Phillip Crawley, Publisher and CEO of The Globe and Mail. “Their growth helps to make Canada a better place, and we are proud to bring their stories to our readers.”

    About The Globe and Mail

    The Globe and Mail is Canada’s foremost news media company, leading the national discussion and causing policy change through brave and independent journalism since 1844. With award-winning coverage of business, politics and national affairs, The Globe and Mail newspaper reaches 5.9 million readers every week in print or digital formats, and Report on Business magazine reaches 2.1 million readers in print and digital every issue.

    About Lane Four

    Lane Four is a Salesforce.com partner and boutique Salesforce consulting firm. Lane Four also offers Lane Four 2.0, a suite of account-based lead management tools on the Salesforce AppExchange, which allows customers to rapidly scale their sales and marketing processes.

  • A Chat with SingleStore’s Head of Revenue Operations

    Ahmed Chowdhury is an ops veteran. He got his start in sales operations at a leading martech company in 2010, when the field of sales ops was relatively new. With a background in finance, economics, and statistical analysis, Ahmed was a natural fit, and hasn’t looked back since. Over the past decade, he’s assumed leadership roles at an impressive list of tech companies.

    Today, Ahmed leads revenue operations at SingleStore (formerly MemSQL), a fast-growing data platform for operational analytics, ML and AI. We talked to Ahmed to get his perspective on all things revenue operations. What follows are some key insights from our chat.

    On the Benefits of Revenue Ops

    When he started at MemSQL in early 2019, Ahmed headed up a traditional sales ops team. But he quickly began championing a move to a revenue-focused model to promote better resource allocation, align revenue metrics, and eliminate departmental bias around data and architecture.

    The sooner you align your functions under revenue ops, the better.

    While it was initially tough to get buy-in for such a massive structural shift, Ahmed was able to de-silo the company’s operations teams and adopt a revenue ops model earlier this year. Now, marketing ops, sales ops, and customer ops have merged, creating a streamlined operational team that works towards shared goals.

    Today, Ahmed’s revenue operations team is divided into four roles. In addition to Ahmed as head of revenue operations, one person handles data insights and data operations; another handles sales insights and enablement; and a third handles go-to-market ops, field ops, and customer ops.

    So far, this restructure has allowed him to better utilize his team’s talents, eliminate duplication of work, and, most importantly, focus his team on “one view of the truth” when it comes to metrics, goals, and reporting.

    “Companies should start off this way,” says Ahmed. “The sooner you align your functions under revenue ops, the better.”

    According to Ahmed, working under a revenue ops model ultimately saves money, because it allows your team to be both leaner and more efficient. You can do better work with fewer employees because it’s much easier to identify and execute on the priorities that will generate revenue.

    On Hiring for Revenue Ops

    Getting a lean revenue ops team right requires strategic hiring practices. Ahmed has had luck hiring for technical, analytical, and problem-solving skills, which he believes are the most important and the hardest to teach on the job. As a result of this philosophy, he has built a diverse team of highly qualified and multi-talented team members with backgrounds in math, economics, engineering, and sales.

    “Over time, operations has evolved into a data science role. Hiring people with strong technical backgrounds who can code has been critical for our team. One of my team members recently changed the entire code base of our BI tool. This wouldn’t have been a requirement in the past, but today, this is the reality,” says Ahmed.

    Given the unique requirements of an ops team, hiring people with multiple talents has been important, too. For example, one of his team members has a background in both coding and sales—he can code one day and run a sales enablement session the next. This unique combination may be tough to find, but having this kind of flexibility on your team can be incredibly beneficial.

    On Building the Right Tech Stack

    According to Ahmed, there are seven “must-have” tools at the heart of MemSQL’s operations: Salesforce CRM, which is central to the stack; DiscoverOrg and ZoomInfo, a winning combination for account and contact data; Domo, a BI tool chosen for its ability to engage business leaders within its forecast app; ClosePlan, which does a great job of relationship mapping and makes it painless to review deals; Salesforce CPQ, which streamlines quote and contract creation for the sales team; and finally, Slack, which facilitates real-time communication around priority actions, metrics, and more.

    Ahmed also relies heavily on People.ai and Atrium, which together provide robust sales activity reporting. The former measures the activities that result in penetrating key accounts, while the latter is highly effective at activity-based reporting.

    On Revenue Ops Metrics

    Under the new operational structure, Ahmed’s team looks at metrics in four key areas: funnel analytics; sales forecasting & pipeline; sales effectiveness; and renewals.

    The team spends a lot of time in particular on pipeline metrics, looking at the data from many angles. How diverse is the pipeline in terms of top tier vs. lower tier companies? What is the pipeline distribution by stage? By use cases? How much pipeline is being created week over week? How is pipeline distributed by month? These are just a few of the numbers they routinely scrutinize.

    Ahmed also highlighted renewal metrics as a hugely important part of measuring revenue ops. His team is proactive about the renewals process, and knows at any given moment the renewal rate, churn rate, and how much revenue is coming up for renewal. This vigilance allows the team to focus on minimizing churn, with a robust communication plan that starts five months out from renewal.

    Ahmed’s Revenue Ops Tip: “Pipeline Generation Tuesdays”

    Ahmed sets aside every Tuesday for the sales team to review week-over-week pipeline creation reports and execute prospect outreach. While sales rep activity happens daily, setting aside one day a week to double down on prospect outreach has been key. After spending the week creating an outreach plan, reps first present their plans and then spend the full afternoon calling prospects. This day is also incentivized for reps, featuring regular contests tied back to compensation.

  • [Virtual Event] How to Use Flows in Salesforce

    On Wednesday, August 26, Lane Four Founder Andrew Sinclair presented at the Toronto Administrators Trailblazer Community Group Meeting to discuss the best practices for using flows in Salesforce.

    Flows are often the best choice for doing automation in Salesforce. And with new features being added regularly, flows are fast becoming an efficient and powerful way to automate key business processes as an admin. In this presentation, Andrew discussed when to choose flows over other types of Salesforce automation. He also covered how new flow features can revolutionize your approach to automation, including how they can protect your system against CPU timeout errors.

    Learn More About Salesforce Automation

    For a look at the session content, check out some of our past posts on selecting the right automation in Salesforce:

     

  • 2 New Salesforce Flow Features That Address CPU Timeout Errors

    If you’ve read our previous post on using flows in Salesforce, you’ll already know that we love them. Salesforce is constantly adding new features and making improvements to flows. As a result, we’re advising admins more and more to choose flows over other automation options. To our delight, Salesforce’s Winter ’20 and Spring ’20 releases have given us two more reasons to be excited about them. In particular, there are new ways to avoid CPU timeout errors caused by process builder.

    Flows Now Run Before Save (Spring ’20 Release)

    This great new feature has a major limitation—but it can be managed.

    If you struggle with CPU errors in Salesforce, take note of this new feature, which allows you to kick off a flow from a record change before save. This means that flows can now operate on data while it is en route to the database, before any other automation runs (everything except validation rules).

    There are many reasons that using flows is preferable to using process builder. Thanks to this new feature, many of the processes you would normally have to run in a process builder can now happen in flows. Using flows, these processes will be up to 10x faster and won’t cause recursion. Most importantly, using flows for functions like updating a date field will eliminate the strain that process builders can put on your CPU limit.

    Beware of one major limitation here: When your flow happens before save, you cannot filter it on change events. Instead, you will need to indicate within your flow that it should take action only when a specific change happens.

    For example, in the flow below, our client wanted to assign a record only when the contact status field changed to a particular value. We configured the flow to fetch the old contact values, then used a decision element to compare the new field value to the previous value and determine whether it had actually changed to the status in the condition. If we hadn’t configured this field value comparison, the flow would fire on every record with the contact status value in question, whether it was a change or not.

    Scheduled Flows (Winter ’20 Release)

    We’ve written at length about scheduling your automation, both declaratively and in Apex code. As we recently mentioned, one of the best ways to schedule your automation, made possible by changes in the Winter ’20 release, is to use scheduled flows.

    Scheduled flows are another good automation choice to help steer you away from process builders and the CPU timeout errors they can create. Scheduling flows is useful for tasks like sending an email alert based on a date field. For example, have a look at the flow below, which triggers an email alert and is scheduled to run daily. Unlike a process builder, which needs to be triggered on a record change, this scheduled flow scans the system at a specific time each day to determine whether action should be taken based on the conditions set.

    You’ll need to be aware of some limitations with scheduled flows: You can only schedule the flow to run once per day, and you are unable to control the number of records that are processed per batch. So pay attention to make sure you won’t run into volume-related errors, and stick to process builders when necessary.

     

  • How to Schedule Automation in Salesforce Using Apex Code

    In our current blog series on choosing the right type of Salesforce automation, we’ve explored the dos and don’ts of real-time automation in Salesforce, including workflow rules and process builder, flows, and Apex triggers. Then we talked about scheduling your automation declaratively using scheduled flows, process builders with Mass Action Scheduler, and DLRS.

    This week, we tackle why and how to schedule your automation using asynchronous Apex code.

    When Should I Schedule My Automation in Code?

    As we’ve established in previous posts, you should always schedule your automation unless there is a business justification for running it in real time (for example, lead assignment or time-sensitive email alerts). While it is possible to schedule some automation declaratively, it will still be subject to limitations around volume and batch size, which can lead to errors. Scheduling your automation using asynchronous Apex code can avoid some of the pitfalls of scheduling it declaratively.

    What Do I Need to Know When Scheduling Automation in Code?

    Don’t DIY code-hack. (Ever.)

    For Apex triggers, we said to avoid copying and pasting code from the internet with no real understanding of how it works, even though you might get away with it in some simple use cases. But when it comes to coding scheduled functions, you should avoid DIY code in every case. Code-hacking can easily have catastrophic impacts on the rest of your org’s code and cause failures.

    As a cautionary tale: We once had a client implement a DIY coded function as a future call. Unbeknownst to them, the function was initiated every single time any record was edited in their org. This overloaded the Apex flex queue, which completely froze their Salesforce org until the queue was cleared. Obviously, this is a situation to avoid.

    The truth is that declarative tools are getting so sophisticated that in most cases, you won’t even need code in the first place. But when you do need to use it, leave coding to the experts.

    Select the right type of scheduled function.

    There are four ways to run asynchronous Apex code, so you’ll need to pick the most suitable approach. Knowing which one to choose requires coding expertise: another reason we caution against going DIY.

    1. Scheduled Apex

    Use Scheduled Apex to ensure that your function runs only on the records you want it to, when you want it to. This is a good way to avoid volume issues, because it means your function is triggered by a specific schedule and not by user actions. Note: If you use the out-of-the-box Salesforce scheduler, you’ll be limited to one scheduled run per day. But in code, you can schedule jobs as frequently as once an hour. (If you need to run your job more frequently than once an hour, ask us about clever workarounds.)
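
    As a rough sketch (the class name, job name, and cron expression here are hypothetical, and AccountScoreBatch is the batch class sketched under Batch Apex below), scheduling a job in code looks something like this:

    // Hypothetical Schedulable class; the name and the batch it launches are examples only.
    global class NightlyAccountScoreJob implements Schedulable {
        global void execute(SchedulableContext sc) {
            // Hand the real work off to a batch so it runs on exactly the records we target.
            Database.executeBatch(new AccountScoreBatch());
        }
    }

    // Run once from Anonymous Apex. Cron format: seconds minutes hours day-of-month month day-of-week.
    // This example schedules the job for 2:00 AM every day.
    System.schedule('Nightly Account Score', '0 0 2 * * ?', new NightlyAccountScoreJob());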

    2. Queueable Apex

    This option is appropriate when you need to start a long-running operation and get an ID for it, pass complex types to a job, or chain jobs.
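
    Here is a minimal sketch of that pattern (the class name, Lead query, and follow-up job are purely illustrative):

    // Hypothetical Queueable: accepts complex state (a list of records) and returns a job ID.
    public class LeadEnrichmentJob implements Queueable {
        private List<Lead> leadsToEnrich;

        public LeadEnrichmentJob(List<Lead> leads) {
            this.leadsToEnrich = leads;
        }

        public void execute(QueueableContext ctx) {
            // ... long-running work on leadsToEnrich goes here ...

            // A queueable job can also chain a follow-up job from here, e.g.:
            // System.enqueueJob(new SomeFollowUpJob());
        }
    }

    // Enqueue the job and capture its ID so it can be monitored via AsyncApexJob.
    Id jobId = System.enqueueJob(new LeadEnrichmentJob([SELECT Id, Email FROM Lead LIMIT 200]));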

    3. Batch Apex

    Select this approach for long-running jobs with large data volumes that need to be performed in batches, or jobs that need larger query results than regular transactions allow.
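
    A minimal sketch of a batch class under those assumptions (the object, score logic, and class name are hypothetical):

    // Hypothetical batch job for recalculating a score across a large volume of accounts.
    public class AccountScoreBatch implements Database.Batchable<SObject> {

        public Database.QueryLocator start(Database.BatchableContext bc) {
            // A QueryLocator supports far larger result sets than a normal SOQL query.
            return Database.getQueryLocator('SELECT Id FROM Account');
        }

        public void execute(Database.BatchableContext bc, List<SObject> scope) {
            // Each execution processes one batch of records (200 by default).
            // ... recalculate scores for the records in scope, then update them ...
        }

        public void finish(Database.BatchableContext bc) {
            // Runs once after all batches complete, e.g. to send a summary notification.
        }
    }

    // Kick it off with a custom batch size of 100 records per execution.
    Database.executeBatch(new AccountScoreBatch(), 100);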

    4. Future Methods

    Use future methods when you need to prevent a long-running operation from delaying an Apex transaction, when you’re making callouts to external web services, or when you need to segregate DML operations to bypass the mixed DML error.
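
    For illustration only (the class, method, and external service are hypothetical), a future method for callouts looks roughly like this:

    // Hypothetical future method for callouts. Future methods must be static, return void,
    // and accept only primitives or collections of primitives, so we pass record IDs.
    public class LeadEnrichmentCallouts {
        @future(callout=true)
        public static void enrichLeads(Set<Id> leadIds) {
            // ... call the external web service here, then update the matching leads ...
        }
    }

    // Called from synchronous code, e.g. a trigger handler:
    // LeadEnrichmentCallouts.enrichLeads(new Set<Id>{ someLead.Id });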

    For a complete description and more information on options for asynchronous Apex, visit this page.

    Understand that an asynchronous function cannot result in another asynchronous function.

    You may run into one of these common errors: “Too many queueable jobs”, or “Future method cannot be called from a future.” These errors happen when one asynchronous function calls another asynchronous function.

    To avoid this, your code needs to detect that it’s already being called from an asynchronous context and, in that case, run the logic immediately instead of triggering another asynchronous job.

    For example, if a nightly job that calculates an account score results in a future call to assign a lead, it will fail. This kind of failure is a safeguard against the possibility of triggering an endless loop within Salesforce (a function that would never stop running).

    To make sure you’re protecting your org against this possibility, add the following check for the context your code is being called in:

    // Already running asynchronously? Then run the logic directly instead of
    // enqueuing another asynchronous job.
    if (System.isBatch() || System.isFuture() || System.isQueueable()) {
      // run regular code
    } else {
      // Safe to start the batch from this (synchronous) context.
      Database.executeBatch(new MyBatch());
    }

  • How to Declaratively Schedule Automation in Salesforce

    We’ve explored the dos and don’ts of real-time automation in Salesforce, including workflow rules and process builder, flows, and Apex triggers.

    Now we’re switching gears to dive into scheduled automation, starting with scheduling declarative automation.

    When Should I Use Scheduled Automation in Salesforce?

    Over the past few weeks, we’ve established that you should only use real-time automation when absolutely necessary. Whenever it’s possible, you should schedule your automation.

    There are many reasons to use scheduled automation over real-time automation. But primarily, scheduling your automation is a surefire way to avoid hitting user-facing errors. Here are three options for scheduling your automation declaratively.

    Scheduled Flows

    With a steady stream of new features and improvements, flows are fast becoming the principal place to configure automation in Salesforce. Scheduled flows, introduced in the Winter ’20 release, are a feature we particularly love.

    You can schedule your flow, and set its filter conditions, from the Start element. There is a limit on the number of scheduled flow executions that can happen in a 24-hour period in any org. Check the debug logs, which will tell you the number of records the flow runs on, to make sure your flow will not exceed the limit. For more on limits and a full description of scheduled flows, go here.

    Use a scheduled flow when you need to perform a scheduled action that you can’t do in process builder, such as running a query, updating records in loops, presenting a screen flow for the user to interact with, and more.

    While this is a great feature, it’s important to note a couple of major limitations to scheduled flows. You can only schedule the flow to run once per day, and you are unable to control the number of records that are processed per batch. These two limitations mean that your scheduled flow may still cause volume and limit errors. A useful litmus test: if you hit volume errors on mass edits or imports to an object (for example, in Data Loader), your scheduled flow will likely run into the same issues on that object, too.

    Due to these limitations, there is still some automation that will need to happen via scheduled process builders instead of scheduled flows.

    Schedule Process Builders via Mass Action Scheduler

    Mass Action Scheduler is a community tool that allows you to declaratively schedule process builders and more. This is a good approach to scheduling functions with process builder that either can’t or shouldn’t be run in real time. Unlike with scheduled flows, Mass Action Scheduler will allow you to control your batch size, which can be critical for preventing errors (check out the documentation for a how-to). It’s also much easier to create filters in process builder than in flows.

    In addition to controlling for batch size, there are a few other use cases that require a tool like this to schedule process builders. Since process builder can only be triggered by the creation of a record or a record change, there are some actions that it cannot perform in real time and therefore must be scheduled. For example, triggering an action on a date relative to a date field, such as sending a renewal email alert 60 days before the Contract End Date on an Opportunity. Another example is triggering an action based on reaching a certain value in a formula field, such as an account score. Since a change in the formula is not considered a record changing event, this action must be scheduled.

    Declarative Lookup Rollup Summaries (DLRS)

    We’re big fans of this tool, which allows you to build complicated rollup fields quickly and easily without code. It also lets you schedule them. DLRS allows you to summarize information on a parent record from associated child records at the click of a button. For example, you could create a “Last SDR Activity Date” on your lead or contact, calculate product commission, or count the number of products in a work order.

    While this tool can run in real time, doing so can (you guessed it!) cause performance issues. So unless you need your rollup to happen in real time, go ahead and schedule it to roll up every hour, day, or week.

     

  • Using Apex Triggers in Salesforce

    We recently kicked off our new series on choosing the right Salesforce automation. So far, we’ve covered workflow rules, process builder, and flows. In this post, we’re taking on Apex triggers. 

    When Should I Use Apex Triggers in Salesforce?

    • When working with high data volume. Triggers can perform faster in such situations: if you’re getting process builder errors due to volume, consider using a trigger.
    • To control order of execution. If you have something in code and something in process builder and you need them to run in a specific order, use a trigger.
    • When everything else is in code already. When you’re forced to build out a complex coded process, it’s a best practice to consolidate your automations in code. Don’t create a rogue process builder here, because your developer won’t know what it’s doing.
    • If you need to fire before insert or on delete. (Note: this will change in a future release, which will have flows firing before inserts.)
    • When you have really complicated criteria, because Apex triggers can execute this much faster. It is possible to cripple your system with a CPU timeout error if you have a process builder dealing with too many operators (for example, we’ve seen this happen with too many “title contains” operators).

    What Do I Need to Know When Using Apex Triggers?

    Avoid DIY code-hacking.

    It can be tempting to copy/paste code from online sources into your Apex trigger, but this is not always advisable. If you’re doing so in a small Salesforce environment with a very simplistic use case, you can probably get away with it.  But dropping DIY code into a more complex environment as a non-developer is risky, and can result in undeployable automation (even if it works in Sandbox).

    Pay attention to trigger events.

    It’s important to understand the event in which your code will run. A trigger framework can help you to be intelligent about this and ensure that your code runs at the right time. For example, it can detect whether your code should run before insert, after insert, before update, after update, before delete, or after delete.
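
    For illustration, a trigger following this pattern might route each event to a (hypothetical) handler class like so:

    // Hypothetical trigger: one trigger per object, routing each context to a handler method.
    trigger ContactTrigger on Contact (before insert, before update, after insert, after update) {
        if (Trigger.isBefore && Trigger.isInsert) {
            ContactTriggerHandler.beforeInsert(Trigger.new);
        } else if (Trigger.isBefore && Trigger.isUpdate) {
            ContactTriggerHandler.beforeUpdate(Trigger.new, Trigger.oldMap);
        } else if (Trigger.isAfter && Trigger.isInsert) {
            ContactTriggerHandler.afterInsert(Trigger.new);
        } else if (Trigger.isAfter && Trigger.isUpdate) {
            ContactTriggerHandler.afterUpdate(Trigger.new, Trigger.oldMap);
        }
    }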

    Fire your trigger selectively.

    Remember that no matter what, triggers always fire. You need to ensure that your criteria are as selective as possible. As always, avoid issues by logically thinking through every possible outcome of your trigger and ensuring your criteria include a field change event.
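
    As a simple sketch of what that field-change check can look like (the Status__c field and trigger name are hypothetical):

    // Hypothetical trigger that only acts on contacts whose status actually changed.
    trigger ContactStatusChangeTrigger on Contact (after update) {
        List<Contact> changed = new List<Contact>();
        for (Contact c : Trigger.new) {
            Contact previous = Trigger.oldMap.get(c.Id);
            // The field-change check keeps the trigger from doing work on every edit.
            if (c.Status__c != previous.Status__c) {
                changed.add(c);
            }
        }
        if (!changed.isEmpty()) {
            // ... run the real logic only for the records that meet the criteria ...
        }
    }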

    Remember that triggers fire synchronously.

    Firing synchronously subjects Apex triggers to processing limits, including CPU time, query limits, and row locking issues. Triggers should only be used if you must initiate a function in real time. (Stay tuned for an upcoming post where we explore asynchronous processes!)

    What issues have you run into with Apex triggers? We’d love to hear from you!

    Next up in the series, we’ll be discussing scheduled automations. Follow us on Twitter or LinkedIn for the latest.

  • ABM Metrics 101: A Mini-Guide to Measuring Account-Based Marketing in Salesforce

    Account-based marketing (ABM) has entered a new era. Having produced undeniable results for so many organizations (better conversion rates, ROI, and growth), ABM is now mainstream. Today, good B2B marketing is account-based marketing. But even though we have the sophisticated tools and strategies needed to make account-based marketing function, measuring ABM still proves difficult.

    Account-based companies often have trouble getting the data required to understand what’s really going on with ABM. And without this data, it’s impossible to make strategic decisions based on the effectiveness of your programs. Luckily, with Salesforce at the center of your tool integrations and processes, it is possible to get all of the ABM metrics you need from the CRM you already have.

    In this mini-guide, we reveal key considerations for measuring ABM, and share common metrics organizations are using to measure ABM today.

    What’s Inside

    The mini-guide provides a glimpse into common ABM metrics that we’ve seen account-based organizations use. It also illuminates key considerations for tracking and measuring target accounts, intent, engagement, attribution, and reporting.

    Download the Mini-Guide Now