Posted on Feb 09, 2015

It's been over a year and a half since my last blog post.  The time, it flies.  A number of projects kept me super busy during that time, and only recently did things slow down.  It's a "good problem to have" as they say, but it's still a problem. :S

I ran into an issue recently with a Visualforce page I was generating for a project and thought I'd post about what I found, in case it can benefit anybody else. But first...

Governor Limits

For those who haven't used Salesforce yet, one of the first features of the platform you'll be frustrated by is Governor Limits.  Salesforce is a multitenant environment (multiple clients run their Salesforce instances on the same server hardware), so limits exist to make sure no client writes code or runs operations that use up too many of those shared resources and degrade performance for their "neighbours" on the server.

Governor Limits can be frustrating if you've never had to worry about performance or optimizing your use of resources.  If you needed to access the database through multiple queries, you just did it, even if you could have collapsed some of those queries into one or two database calls.  Governor Limits track how many rows your queries return, how many queries you run, how much memory you take up and how long your processes run, and if you're not doing things in an optimal way, Salesforce will stop and throw an error once one of the limits is exceeded.
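To make the "collapse your queries" idea concrete, here's a rough Apex sketch (the object and field names are just for illustration) of the pattern Governor Limits push you toward: one bulkified query instead of a query inside a loop.

    // Anti-pattern: one SOQL query per record.  This burns through
    // the SOQL query governor limit after enough iterations.
    for (Contact c : contacts) {
        Account a = [SELECT Name FROM Account WHERE Id = :c.AccountId];
        // ... do something with a ...
    }

    // Bulkified: collect the Ids first, then make one database call.
    Set<Id> accountIds = new Set<Id>();
    for (Contact c : contacts) {
        accountIds.add(c.AccountId);
    }
    Map<Id, Account> accountsById = new Map<Id, Account>(
        [SELECT Name FROM Account WHERE Id IN :accountIds]);
    // Inside the loop, look each Account up in the map instead of querying.

The first version counts one query per iteration against the limit; the second counts one query total, no matter how many records are in the list.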

Governor Limits are no small source of headaches, but they make you a better programmer. They force you to really think about your design and whether you've optimized your code.

Using Visualforce to generate an Excel file with over 1,000 records

I had a requirement to build an Excel file generated from data within the system.  The data had to be massaged a little to get it into a certain format, the file would display rows of data from the database, and the headers had to have specific labels because the Excel file was going to be uploaded into another system that would parse it and extract the data. Using a VF page to generate a PDF or Excel file is pretty easy, so this is the route we went.

The problem is that VF has its own limits, which we didn't take into account, the number one issue being that a VF "collection" (a list of items to be used in a VF "repeat" or "dataTable") is limited to 1,000 records, and the data needed for the report had recently gone over that limit.
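For context, the page was built around something like the following (simplified, with the controller and field names invented); the {!records} collection bound to the dataTable is what's capped at 1,000 items:

    <apex:page controller="ReportExportController"
               contentType="application/vnd.ms-excel#export.xls">
        <apex:dataTable value="{!records}" var="r">
            <apex:column value="{!r.Some_Field__c}" headerValue="Custom Header"/>
            <apex:column value="{!r.Other_Field__c}" headerValue="Other Header"/>
        </apex:dataTable>
    </apex:page>

Setting the contentType with a filename after the # is what makes the browser download the rendered table as an Excel file.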

I tried a few different solutions to the problem:

  • Generate a string of HTML instead of VF components: Instead of using the built-in VF dataTable component and topping out at 1,000 records, I tried just compiling a long string of rendered HTML.  Results? Disastrous.  My initial stab at it went way over the CPU running time governor limit of 10 seconds.

    I initially wrote my code like this inside of a loop:

    myString += '<tr>';
    myString += '<td>' + record.someField + '</td>';
    myString += '<td>' + record.someOtherField + '</td>';
    myString += '</tr>';

    This was not at all performant.  A much better form would have been:

    myString += '<tr>' + '<td>' + record.someField + '</td>' + '<td>' + record.someOtherField + '</td>' + '</tr>'; // etc.

    Assigning all the content to the string in one go was more efficient than concatenating each piece individually (probably a good optimization rule of thumb for any programming language).  It wasn't enough though: CPU usage came in under the limit, but as the data grew it would have topped out soon enough.

    Salesforce must have optimized the VF components, or at least excluded their processing/rendering time from counting towards the CPU limit, as using them to render the page used hardly any CPU time at all.
  • Use the <apex:page> attribute "readOnly": I have to admit this was a new one for me.  Setting this attribute in your page's VF tag beefs up some of the limits applied to VF pages, at the expense of not being able to use update, insert and delete DML (Data Manipulation Language) calls in the page's controller.  For my purposes this was ideal since I'm just generating a report.

    This worked as a quick fix, but won't last forever.  The limit is only boosted up to 10,000 records, so we'll run out of headroom there eventually as well.

    Note: you could work around this by creating multiple collection variables, putting 10,000 records in each, and having multiple "dataTable" or "repeat" tags on your page to spit out all the data.  This wouldn't work indefinitely though.  You'd have to keep adding collection variables as the database got bigger, and eventually you'd run into the CPU time governor limit (10 seconds) or the heap size governor limit (6 MB).
  • Use Salesforce Reports: this is probably the route we should have gone in the first place.  You can build a report in Salesforce, and while the report won't show every single record in your database on the report page, you can get all the data by exporting the report.  Currently there are no limits applied to this export process, so your database could have hundreds of thousands of records and you could still export them to an Excel spreadsheet.

    But what about the customized headers? We could handle those with Custom Report Types.  Part of creating a new report type is setting up a Template for it, which is where you set what fields can be used.  One of the abilities of the Template is letting you rename fields.  So we just create a new report type, drag in the fields we're going to use, and re-label them so that we get the headers we want in the report.

    But what about massaging the data?  That requires adding some formula fields to copy the original value and apply transformations or string replacements as the case may be.  We'd keep them off the page layout, but use them in the Template for our custom report type.
  • Data Loader: use the Salesforce Data Loader Java tool or one of the many alternatives like Jitterbit. This would be the last resort, as the result would be a CSV file which would need to be manually placed into an Excel file (hopefully without Excel screwing up any of the data by auto-formatting some of the fields, like dates and decimals) and have the headers replaced.  The tools have graphical interfaces, but they aren't even close to being usable by a non-technical user without proper training.
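Going back to the readOnly option for a second: applying it really is just a one-attribute change on the page tag (sketch, with an invented controller name), which is why it made such an appealing quick fix:

    <apex:page controller="ReportExportController" readOnly="true"
               contentType="application/vnd.ms-excel#export.xls">
        <!-- readOnly raises the collection limit from 1,000 to 10,000 rows, -->
        <!-- but the controller can no longer perform insert/update/delete DML. -->
        <apex:dataTable value="{!records}" var="r">
            <apex:column value="{!r.Some_Field__c}" headerValue="Custom Header"/>
        </apex:dataTable>
    </apex:page>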

Anyways, I hope you can benefit from what I learned on this topic, and hopefully this will get me back into posting again. :)