
Let's Code: A Cleaner Way to Move Data Between OneStream Relational Tables

Eric Padua
For many custom planning modules in OneStream, working with relational tables is a necessity. The tasks vary but often include:
  • Inserting data into relational tables from flat files
  • Writing custom calculations using T-SQL
  • Using relational tables to view things in dashboard components
  • Storing settings in relational tables for specific planning architecture
  • The list goes on...
Developers use SQL in OneStream in different ways. Some write pure T-SQL for their calculations; others work within the database functions that OneStream provides through the BRApi.
In this blog, I will go over three methods of getting data from one relational table to another, the last being the one I now prefer.

The Scenario

We have a planning module where users can enter data, and that data flows into a table that we will call XFC_Custom.
This custom table shares columns with a marketplace solution relational table (think People Planning, Thing Planning, etc.), and its data will eventually need to be pushed into that table for processing by the solution.
This type of design usually exists because the client has a very specific way of viewing and processing their data. Entering the data directly into the solution is not always the best approach, which leads us to augment the marketplace solutions with entirely new functionality.

Usual Ways to Push the Data

There are a couple of common ways developers handle this data push.

BRApi Functions

One method is using BRApi.Database.InsertOrUpdateRow. This requires helper functions or logic to build dictionaries of row values to pass to the function, and you might loop through an entire collection, updating rows one by one. This can all run on the same connection and performs fine for small data sets (perhaps fewer than 100,000 rows).
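A rough sketch of that loop looks like the following. Note that the exact BRApi.Database.InsertOrUpdateRow parameter list varies by platform version, so treat the table name, column names, and argument order here as illustrative placeholders rather than the definitive signature:

```
// Illustrative sketch of the row-by-row approach.
// TargetTable, the key column list, and the exact
// InsertOrUpdateRow parameters are placeholders.
using (DbConnInfo dbConn = BRApi.Database.CreateApplicationDbConnInfo(si))
{
    foreach (Dictionary<string, object> rowValues in rowsToPush)
    {
        // One round trip per row: the platform decides between
        // INSERT and UPDATE based on the key columns.
        BRApi.Database.InsertOrUpdateRow(dbConn, "TargetTable", rowValues, keyColumnNames);
    }
}
```

Each iteration is its own statement against the database, which is exactly why this approach slows down as row counts grow.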

One-by-One Inserts/Updates/Deletes

Another approach, often seen with a Dynamic Grid or SQL Table Editor, involves working with the EditedDataRows collection in the args object. The specific objects depend on whether you are using a Dynamic Grid or a SQL Table Editor.
This requires multiple SQL statements keyed off each row's InsertUpdateOrDelete flag.
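In outline, that handler is a dispatch over the flag. The object and property names below depend on which component you are using, so treat them as placeholders:

```
// Illustrative sketch: dispatch on each edited row's status flag.
// Type and property names vary between Dynamic Grid and
// SQL Table Editor, so these are placeholders.
foreach (var row in editedDataRows)
{
    switch (row.InsertUpdateOrDelete)
    {
        case RowStatus.Insert: /* run a parameterized INSERT */ break;
        case RowStatus.Update: /* run a parameterized UPDATE */ break;
        case RowStatus.Delete: /* run a parameterized DELETE */ break;
    }
}
```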
I have used both of these methods for a long time, but I recently found a really effective alternative: T-SQL's MERGE functionality.

The Merge Method

This method uses a single SQL statement to handle updates, deletes, and inserts, complete with custom logic expressed as SQL conditions.
Once the SQL has been written, all you need to do is call BRApi.Database.ExecuteActionQuery.

Here's an example:

MERGE SQL Template

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using OneStream.Finance.Engine;
using OneStream.Shared.Common;
using OneStream.Shared.Database;
using OneStream.Shared.Engine;
using OneStream.Shared.Wcf;

namespace Workspace.__WsNamespacePrefix.__WsAssemblyName
{
    /// <summary>
    /// Template for implementing SQL MERGE operations to synchronize data between custom tables.
    /// Uses a single MERGE statement for optimal performance (INSERT, UPDATE, DELETE in one operation).
    /// </summary>
    public class MergeHelper
    {
        private StringBuilder debugString;

        /// <summary>
        /// Performs a SQL MERGE operation to synchronize data from the source table to the target table.
        /// TODO: Customize table names, column mappings, and business logic.
        /// </summary>
        public void PushToRelationalTable(SessionInfo si)
        {
            try
            {
                this.debugString = new StringBuilder();
                this.debugString.AppendLine("MergeHelper: Starting process");

                // Get workflow context
                string wfScenario = BRApi.Finance.Members.GetMemberName(si, (int)DimTypeId.Scenario, si.WorkflowClusterPk.ScenarioKey);
                string wfTime = BRApi.Finance.Members.GetMemberName(si, (int)DimTypeId.Time, si.WorkflowClusterPk.TimeKey);
                this.debugString.AppendLine($"MergeHelper: Workflow Scenario={wfScenario}, Time={wfTime}");

                // Set up query parameters
                List<DbParamInfo> dbParams = new List<DbParamInfo>
                {
                    new DbParamInfo("WFScenario", wfScenario),
                    new DbParamInfo("WFTime", wfTime)
                };

                // TODO: Update SourceTable, TargetTable, and all column mappings
                // SQL MERGE statement template
                // Structure: MERGE INTO [target] USING [source] ON [join_condition]
                //   WHEN MATCHED AND [delete_condition] THEN DELETE
                //   WHEN MATCHED AND [update_condition] THEN UPDATE
                //   WHEN NOT MATCHED AND [insert_condition] THEN INSERT
                string mergeSql = @"
                    MERGE INTO TargetTable AS target
                    USING (
                        SELECT
                            SourceColumn1,
                            SourceColumn2,
                            SourceColumn3,
                            WFProfile,
                            WFScenario,
                            WFTime,
                            StatusColumn1,
                            StatusColumn2
                        FROM SourceTable
                        WHERE WFScenario = @WFScenario
                          AND WFTime = @WFTime
                    ) AS source
                    ON target.TargetKeyColumn = source.SourceColumn1
                    AND target.WFProfileName = source.WFProfile
                    AND target.WFScenarioName = source.WFScenario
                    AND target.WFTimeName = source.WFTime
                    -- TODO: Customize delete condition (e.g., when approval is revoked)
                    WHEN MATCHED AND (source.StatusColumn1 = 0 OR source.StatusColumn2 = 0) THEN
                        DELETE
                    -- TODO: Customize update condition and column mappings
                    WHEN MATCHED AND source.StatusColumn1 = 1 AND source.StatusColumn2 = 1 THEN
                        UPDATE SET
                            target.TargetColumn1 = source.SourceColumn2,
                            target.TargetColumn2 = source.SourceColumn3,
                            target.UpdatedDate = GETDATE()
                    -- TODO: Customize insert condition and column mappings
                    WHEN NOT MATCHED AND source.StatusColumn1 = 1 AND source.StatusColumn2 = 1 THEN
                        INSERT (
                            TargetKeyColumn,
                            WFProfileName,
                            WFScenarioName,
                            WFTimeName,
                            TargetColumn1,
                            TargetColumn2,
                            TargetColumn3,
                            CreatedDate
                        )
                        VALUES (
                            source.SourceColumn1,
                            source.WFProfile,
                            source.WFScenario,
                            source.WFTime,
                            source.SourceColumn2,
                            source.SourceColumn3,
                            'DefaultValue',
                            GETDATE()
                        );";

                this.debugString.AppendLine("MergeHelper: Executing MERGE statement");

                using (DbConnInfo dbConn = BRApi.Database.CreateApplicationDbConnInfo(si))
                {
                    long rowsAffected = BRApi.Database.ExecuteActionQuery(dbConn, mergeSql, dbParams, true, true);
                    this.debugString.AppendLine($"MergeHelper: {rowsAffected} rows affected");
                }

                this.debugString.AppendLine("MergeHelper: Process completed successfully");
            }
            catch (Exception ex)
            {
                throw new XFException(this.debugString.ToString(), ex);
            }
        }
    }
}
```
Essentially this is how it works:

The Tables

We have our source table (the table we want to push data from) and our target table that we are applying the merge statement to (e.g., People Planning, Thing Planning, Capital Planning).

The Logic

We have three separate sections in the Merge statement.
  1. One for deletes: We have some kind of conditionality to delete when certain values exist for the row.
  2. One for updates: We target columns that we want to update given our mapping of keys that identify singular rows.
  3. One for inserts: We insert completely new rows, with default values cast automatically.
MERGE Statement Structure

```sql
MERGE INTO TargetTable AS target
USING SourceTable AS source
ON target.KeyColumn = source.KeyColumn

-- Delete rows that no longer exist in source
WHEN MATCHED AND source.IsDeleted = 1 THEN
    DELETE

-- Update existing rows
WHEN MATCHED THEN
    UPDATE SET
        target.Column1 = source.Column1,
        target.Column2 = source.Column2

-- Insert new rows
WHEN NOT MATCHED BY TARGET THEN
    INSERT (KeyColumn, Column1, Column2)
    VALUES (source.KeyColumn, source.Column1, source.Column2);
```
The beauty of this method is that all transactions happen on the SQL Server backend with one single statement and one BRApi call.
This avoids building a pile of helper functions around BRApi.Database.InsertOrUpdateRow and avoids handling delete logic separately through some kind of conditional statement. The MERGE statement takes care of everything and makes the entire rule extremely readable.
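As a side benefit, T-SQL's OUTPUT clause can report exactly what the MERGE did per row, which is handy while debugging the rule. A minimal sketch, with placeholder table and column names:

```sql
-- $action resolves to 'INSERT', 'UPDATE', or 'DELETE' for each row.
MERGE INTO TargetTable AS target
USING SourceTable AS source
ON target.KeyColumn = source.KeyColumn
WHEN MATCHED THEN
    UPDATE SET target.Column1 = source.Column1
WHEN NOT MATCHED BY TARGET THEN
    INSERT (KeyColumn, Column1)
    VALUES (source.KeyColumn, source.Column1)
OUTPUT $action AS MergeAction, inserted.KeyColumn, deleted.KeyColumn;
```

Running this with an executeQuery-style BRApi call instead of an action query would return the per-row actions as a result set you can log.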

Cleaner and Easier

Work like this becomes much easier with T-SQL's MERGE functionality.
Coding it out the usual way would probably take twice as long. In fact, that is roughly how long my BRApi.Database.InsertOrUpdateRow processes took to write before I switched to the MERGE statement.
In general, I prefer methods that are simpler and more readable, and this approach is likely far more performant since all the work is done on SQL Server.
But anyway, give this a try if you find yourself doing this type of work. I promise you, it is a much better experience.