Building meaningful charts using the QuickBooks Reporting API
The QuickBooks API provides a handful of endpoints that let us assemble its most common reports, such as the Balance Sheet or Profit & Loss. Unfortunately, with this power comes a level of abstraction that makes it quite tough to wrangle the data into a meaningful chart or report.
In this post, I will demonstrate how these endpoints structure their data, why that structure is actually excellent (for us nerds), and the SQL required to coerce it into something that humans (not nerds) can understand.
The structure
Please note: in this tutorial I’ll be using the Profit & Loss report, but the same concepts apply to all of the report types.
When you think about your Profit & Loss report, you probably imagine it visually - a list of items down the left-hand side, with corresponding values for the month in a separate column. Something like this (taken from the QuickBooks dashboard):
But that’s not how the API provides the data. Instead, we get a nested JSON payload, similar to this:
{ "CompanyInfo": { "CompanyName": "Sandbox Company_US_1", "LegalName": "Sandbox Company_US_1", "CompanyAddr": { "Id": "1", "Line1": "123 Sierra Way", "City": "San Pablo", "Country": "USA", "CountrySubDivisionCode": "CA", "PostalCode": "87999" }, "CustomerCommunicationAddr": { "Id": "1", "Line1": "123 Sierra Way", "City": "San Pablo", "Country": "USA", "CountrySubDivisionCode": "CA", "PostalCode": "87999" }, "LegalAddr": { "Id": "1", "Line1": "123 Sierra Way", "City": "San Pablo", "Country": "USA", "CountrySubDivisionCode": "CA", "PostalCode": "87999" }, "CustomerCommunicationEmailAddr": { "Address": "ops@blackballsoftware.com" }, "PrimaryPhone": { "FreeFormNumber": "4081234567" }, "CompanyStartDate": "2024-03-06", "FiscalYearStartMonth": "January", "Country": "US", "Email": { "Address": "ops@blackballsoftware.com" }, "WebAddr": {}, "SupportedLanguages": "en", "DefaultTimeZone": "America/Los_Angeles", "NameValue": [ { "Name": "NeoEnabled", "Value": "true" }, { "Name": "NonTracking", "Value": "false" }, { "Name": "IsQbdtMigrated", "Value": "false" }, { "Name": "CompanyType", "Value": "Other" }, { "Name": "SubscriptionStatus", "Value": "SUBSCRIBED" }, { "Name": "OfferingSku", "Value": "QuickBooks Online Plus" }, { "Name": "PayrollFeature", "Value": "false" }, { "Name": "AccountantFeature", "Value": "false" }, { "Name": "QBOIndustryType", "Value": "Landscaping Services" }, { "Name": "ItemCategoriesFeature", "Value": "false" }, { "Name": "AssignedTime", "Value": "2024-03-06T17:56:43-08:00" } ], "domain": "QBO", "sparse": false, "Id": "1", "SyncToken": "8", "MetaData": { "CreateTime": "2024-03-06T16:56:43-08:00", "LastUpdatedTime": "2025-08-17T19:04:51-07:00" } }, "time": "2025-10-06T13:40:20.816-07:00" }
Wow, that is quite a lot to take in. You’ll probably recognize a few things like your product/service names and the totals. But it’s not immediately obvious how to coerce this into a tabular format.
Step 1/2 - Storing the data in a relational database
The format of the API response is called JSON, and it’s a common format for APIs to return data in. It’s possible to query and manipulate JSON data directly, but your reporting tool probably doesn’t have that facility and, besides, you have to be pretty nerdy to do it.
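For instance, if you’re on SQL Server 2016 or later, you can shred the payload yourself with OPENJSON. Here’s a minimal sketch (assuming you’ve stashed the raw response in an nvarchar(max) value - the fragment below is illustrative only), just to show it can be done:

-- Minimal sketch: shred a fragment of the report JSON with OPENJSON
-- The @json fragment here is illustrative only
declare @json nvarchar(max) = N'{ "Rows": { "Row": [
	{ "ColData": [ { "value": "Design income" }, { "value": "2250.00" } ], "type": "Data" },
	{ "ColData": [ { "value": "Landscaping Services" }, { "value": "1477.50" } ], "type": "Data" }
] } }'

select
	JSON_VALUE(r.[value], '$.ColData[0].value') as Item,
	JSON_VALUE(r.[value], '$.ColData[1].value') as Total
from openjson(@json, '$.Rows.Row') r

It works, but it isn’t something most reporting tools will do for you.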
So, our first step is to convert this JSON into a relational-database format. Reporting tools love relational data. At SyncHub, we came up with the data model below.
If you have a SyncHub subscription connected to QuickBooks, these tables will automatically appear in your database, and the data will be kept up to date automatically too.
As you can see, we retained the abstract nature of the data (rows & cells), and augmented it with a few quality-of-life fields such as typed columns and foreign keys.
At this point you’re probably thinking that we’ve just replaced one confusing format with another. And from a human’s point of view, you’re right. But from your reporting tool’s point of view there is one major new benefit - SQL queries.
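To give you a taste, here’s a quick sanity-check query (using the same SyncHub table names you’ll see in the full query below) that simply lists the raw rows and cells for the most recent report - no pivoting yet:

-- List the raw rows and cells for the most recent Profit & Loss report
declare @ReportID nvarchar(500)

select top 1 @ReportID = RemoteID
from [CONNECTIONS.quickbooks].[ProfitAndLossReportByFiscalYear]
order by [StartPeriod] desc

select rr.RowNumber, cc.ColumnNumber, cc.[Value]
from [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRow rr
	left join [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRowColumnValue cc
		on rr.RemoteID = cc.RowRemoteID
where rr.ReportRemoteID = @ReportID
and rr.IsDeleted = 0
order by rr.RowNumber, cc.ColumnNumber

Each report row comes back with its cells as separate records - which is exactly the shape the pivot query in the next step needs.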
Step 2/2 - Querying your QuickBooks reports using SQL queries
Now, this data does not lend itself to your run-of-the-mill SELECT queries. No, for this you need something more powerful - the PIVOT query. I won’t go into details about exactly how PIVOT works - you can search Google or ask ChatGPT about that. But in short, it lets us turn imported data values (here, the report’s column titles, such as month names) into column headings in our result set. I’ll cut straight to the code, pasted below.
Note that I’m using MS SQL, which is the default database provided free as part of your SyncHub subscription. If you’re using an alternative such as Postgres, you’ll have to tweak accordingly.
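If PIVOT is new to you, here’s a tiny, self-contained warm-up first (made-up values, nothing QuickBooks-specific) showing the core trick - values from rows becoming column headings:

-- Toy example only: turn the Month values into column headings
;with cells as (
	select 'Income' as Item, 'Jan 2025' as [Month], 100.00 as Amount
	union all select 'Income', 'Feb 2025', 150.00
	union all select 'Expenses', 'Jan 2025', 40.00
)
select Item, [Jan 2025], [Feb 2025]
from cells
pivot (min(Amount) for [Month] in ([Jan 2025], [Feb 2025])) as pt

-- Result: one row per Item, with 'Jan 2025' and 'Feb 2025' as columns
-- (Expenses has no Feb value, so that cell comes back NULL)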
declare @ReportID nvarchar(500)
declare @ColumnNames table (ColumnName nvarchar(500), CellNumber int)

-- Get the most recent report
select top 1 @ReportID = RemoteID
from [CONNECTIONS.quickbooks].[ProfitAndLossReportByFiscalYear]
order by [StartPeriod] desc

-- Some reports do not have a column name in the initial column, as it is implicit. Our pivot below however
-- needs a column name, so we'll create one here
declare @DefaultColumnName nvarchar(100) = 'Item'

-- Get our column names. Most typically these are the months that your report covers (e.g. 'Feb 1979')
-- Note that this logic is duplicated in the dynamic query below - REF_COLUMN_NAMES
insert into @ColumnNames
select
	case when isnull(col.ColTitle, '') = '' then @DefaultColumnName else col.ColTitle end as ColumnName,
	col.ColumnNumber
from [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearColumn col
where col.ProfitAndLossReportByFiscalYearRemoteID = @ReportID
and col.IsDeleted = 0
order by col.ColumnNumber

-- Check that we have data
if not exists (select 1 from @ColumnNames)
begin
	raiserror('No data is present - please make sure your data is synced and up to date', 16, 1)
	return
end

-- Pull the column names into a variable so we can dynamically inject them into our query below
declare @Columns nvarchar(max) = ''
select @Columns += QUOTENAME(ColumnName) + ',' from @ColumnNames order by CellNumber
set @Columns = left(@Columns, len(@Columns) - 1)

-- Dynamically execute our SQL so that we may inject these column names using string concatenation
declare @SQL nvarchar(max) = N'
;with allcells as (

	select
		rr.Type as RowType,
		rr.RowNumber,
		rr.RemoteID as RowRemoteID,
		columnNames.ColumnName,

		-- Optional: we indent the first column because this is the description, and reports usually group items,
		-- so this gives a nice visual representation
		case when cc.ColumnNumber = 1 then space(ReportNestingLevel) + cc.[Value] else cc.[Value] end as CellValue

	from [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRow rr
		left join [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRowColumnValue cc on rr.RemoteID = cc.RowRemoteID

		-- Cross reference to find the corresponding column name, as per the same logic in REF_COLUMN_NAMES above
		-- We need this so that we can JOIN later
		left join (
			select
				case when isnull(cc.ColTitle, '''') = '''' then @DefaultColumnName else cc.ColTitle end as ColumnName,
				cc.ColumnNumber
			from [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearColumn cc
			where cc.ProfitAndLossReportByFiscalYearRemoteID = @ReportID
			and cc.IsDeleted = 0
		) columnNames on cc.ColumnNumber = columnNames.ColumnNumber

	where rr.ReportRemoteID = @ReportID
	and rr.IsDeleted = 0

),
cellsByColumn as (

	-- Pivot the cells so that they are based on the column name
	select *
	from allCells
	pivot (
		-- Pivot requires that we use some kind of aggregation, so we will just use min() here - noting that there is only one value anyway
		min(CellValue)
		for ColumnName in (' + @Columns + ')
	) as pt

)

-- Now just format the final report, using all the (dynamically generated) month names as columns
select ' + @Columns + '
from cellsByColumn

-- The API includes rows which appear duplicated but are in fact just placeholder rows to indicate sections. From what we can tell, you only need Data or NULL, but
-- if you are missing records, try removing this filter
where (RowType = ''Data'' or RowType is null)

order by RowNumber
'

-- The work above merely concatenated a long SQL string. Here, we execute it to actually query the database, using sp_executesql
exec sp_executesql @SQL, N'@ReportID nvarchar(500), @DefaultColumnName nvarchar(100)', @ReportID, @DefaultColumnName
If you execute this against your SyncHub data store, you’ll get something along the lines of the results below:
So, how can you use this?
I won’t lie to you - that is a gnarly query. But that is the nature of BI sometimes, and think how impressed everybody in the office will be! So let’s move on to the final step - how exactly can you use this query in your reports?
Using your own reporting tool
Unfortunately, we can’t provide guidance for every reporting tool, but (assuming your tool supports SQL queries, which it almost certainly does) you can essentially copy/paste this query into your editor and tweak it accordingly.
Note: you must replace the schema placeholder [CONNECTIONS.quickbooks] with the actual schema name of your SyncHub data store.
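If you’re not sure what that schema is called, you can list the schemas in your data store with a standard query like this:

-- List the schemas available in your data store, so you can spot your SyncHub connection's schema
select SCHEMA_NAME
from INFORMATION_SCHEMA.SCHEMATA
order by SCHEMA_NAME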
There is also the chance that your reporting tool won’t let you execute dynamically-assembled strings. In this case, you’ll have to either:
hard-code the column names (changing them every month, as required) - see the sketch after this list; or
use our Insights option below, then query the resulting table from your reporting tool
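For the first option, a hard-coded version of the query might look something like the sketch below. It assumes your report has just the implicit 'Item' column plus a 'Total' column - substitute your actual column/month names, and keep the list in sync with your report by hand:

-- Hard-coded variant of the pivot query: no dynamic SQL, but the column list
-- ([Item], [Total]) is an assumption and must be edited whenever your report's columns change
declare @ReportID nvarchar(500)

-- Get the most recent report, as before
select top 1 @ReportID = RemoteID
from [CONNECTIONS.quickbooks].[ProfitAndLossReportByFiscalYear]
order by [StartPeriod] desc

;with allcells as (
	select
		rr.Type as RowType,
		rr.RowNumber,
		case when isnull(col.ColTitle, '') = '' then 'Item' else col.ColTitle end as ColumnName,
		cc.[Value] as CellValue
	from [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRow rr
		left join [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearRowColumnValue cc on rr.RemoteID = cc.RowRemoteID
		left join [CONNECTIONS.quickbooks].ProfitAndLossReportByFiscalYearColumn col
			on cc.ColumnNumber = col.ColumnNumber
			and col.ProfitAndLossReportByFiscalYearRemoteID = @ReportID
			and col.IsDeleted = 0
	where rr.ReportRemoteID = @ReportID
	and rr.IsDeleted = 0
)
select [Item], [Total]
from allcells
pivot (min(CellValue) for ColumnName in ([Item], [Total])) as pt
where (RowType = 'Data' or RowType is null)
order by RowNumber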
Using SyncHub Insights
For a much easier solution, you can use this query in our Insights platform. Insights will periodically execute the query, and store the results in a new database table. From there, you can either:
query your new database table from your reporting tool, using a simple select * from… query (see the example after this list); or
use Insights itself to render and share your reports
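For example, if your Insight writes its results to a table called insights.ProfitAndLossPivoted (a hypothetical name - yours will be whatever you configured), the reporting-tool side is just:

-- 'insights.ProfitAndLossPivoted' is a hypothetical schema/table name -
-- substitute the table that your Insight actually writes to
select * from insights.ProfitAndLossPivoted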
To conclude
QuickBooks is great. Their reporting API is great. SyncHub is great.
Happy reporting everybody!