New Hierarchy Level Behavior in Planning Analytics (TM1) Set Editor


Leaf Element Level naming convention is really changing...

It was when I first started teaching the Performance Modeler (PM) class and showing students how to create a TI Process with the Guided Import that I noticed the leaf level in PM was not Zero.  The highest consolidated level was Zero.  At the time I assumed it was a fluke, that PM had simply been designed differently, and that the rest of TM1 would stick with the numbering convention Architect used.  Wrong!

You have to be a 10 to be number one.  Huh???

Fast forward and I'm now working with Planning Analytics Workspace and PAx.  Like many consultants, we learned TM1 using Architect.  In Architect, level Zero was synonymous with the leaf-level element of a dimension, and we all know that TM1 holds numeric data only at the leaf level.  We've gotten very accustomed to using the terms "leaf", "n-level" and "Zero-level" interchangeably.  In fact, my first dynamic subset was created using the Filter by Level option and choosing "0" for the leaf elements, and I even named the subset "Level_0".

Being Level Zero is tops!

Historically TM1 has labeled its hierarchy levels where level 0 is the leaf level and the highest number is the top level.

This behavior persists today in Architect and Perspectives. With the new set editor released in the second version of Planning Analytics, however, the set editor follows industry-standard MDX practice: level 0 is the highest level and the highest number is the leaf level. In other words, the reverse of what it was previously.

End users should be educated about this change to avoid confusion when working with the data.

This issue is best highlighted when editing a set. First let’s look at Architect. In this example the plan_time dimension is opened and all elements are shown.

If we filter by levels as shown below, the 0 level will be the leaf level.

And if we filter the level on 2, in this case, it will return the highest level.

Conversely, in the Planning Analytics set editor in Planning Analytics Workspace or Planning Analytics for Excel, the filtering is reversed and is in alignment with standard MDX queries. In this example, we are filtering on level000, which returns the top level as seen below.

And if we filter on level002, it returns the leaf level as seen below.

The set editor also makes it simple to filter on the leaf level directly by providing a Leaf Level option in the filter.

Before we get into some solutions to simplify the end user experience, remember that if users are proficient in MDX syntax including TM1-specific MDX functions, they can always quickly build their expression-based subsets by editing the MDX in the Set Editor.

For example, the following standard MDX expression would return all leaf level elements in our example:
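A sketch of such an expression, assuming the plan_time hierarchy from the example (three levels, numbered 0 through 2), uses the standard MDX Levels function:

{[plan_time].Levels(2).Members}

Here Levels(2) addresses the lowest level of the hierarchy under the new numbering, so the returned set contains only leaf elements.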


If you wanted to return the top level elements, you could use the following expression:
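A sketch, assuming the same plan_time hierarchy with levels 0 through 2, where Levels(0) addresses the top of the hierarchy:

{[plan_time].Levels(0).Members}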


Again, where 0 is the highest level and 2 is the lowest level in the example.

Conversely, you could use TM1-specific MDX functions to reproduce the old behavior in the set editor’s MDX expression. The following syntax returns the leaf level:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 0)}

Or, the following to return the highest level in our example:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 2)}

Where 0 is the leaf level and 2 is the top level.

Methods to Simplify the User Experience

Create Named Levels

One method is to give your hierarchy levels meaningful names for your end users.

To do so:

In Architect, ensure you have enabled Display Control Objects under the View menu.

Find and open the }HierarchyProperties cube, and then from the Dimensions drop-down list, select the dimension whose naming convention you would like to override.

In this case we will select plan_time and provide the following level names.

Save the View as Default and not Private.

Now the plan_time dimension’s hierarchy needs to be updated. To do so, we will run a TI process. Right-click Processes in the left pane and click Create New Process.

Click the Advanced tab and then the Prolog tab.

After the Generated Statements section, enter the following syntax, replacing plan_time with the dimension you want to update:

RefreshMdxHierarchy('plan_time');
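If you would rather script the whole step, the level names can also be written from the prolog before the refresh. A sketch, assuming the }HierarchyProperties cube layout described above and illustrative level names of Year, Quarter and Month:

# Set friendly level names for plan_time (names are illustrative)
CellPutS('Year', '}HierarchyProperties', 'plan_time', 'level000');
CellPutS('Quarter', '}HierarchyProperties', 'plan_time', 'level001');
CellPutS('Month', '}HierarchyProperties', 'plan_time', 'level002');
# Rebuild the MDX hierarchy so the new names take effect
RefreshMdxHierarchy('plan_time');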

Save the Process with an appropriate name and then click on Run to execute the TI process.

Now in the Planning Analytics Set Editor, the level names will appear in a more meaningful way.

Dealing with Ragged or Unbalanced Hierarchies

In some cases, the hierarchies may present elements at the same level as your consolidated levels which you may not want to display in your results when working in the Set editor.

In our example, we can see this with the Description element being displayed when filtering on the Year level.

A simple way to make the user experience more focused is by creating appropriate subsets.

In this case, you could create and save a Public subset for the dimension that only displays the Years; for example, a subset called All Years.

With this selection, users have a focused list of elements to work with, which they can also expand, and there is no need for the extra clicks involved in filtering.

Notice that Description is no longer present.

So the moral of this story is that you may as well get on board with the new numbering convention and start naming your subsets according to what "Level Zero" means going forward, rather than keeping its legacy meaning.  This will also keep the new set editor consistent with the way Performance Modeler treats levels in the guided import process for creating TI Processes.

Get more out of your Planning Analytics/TM1 solution faster

Need to build more analytics into your model, but don't have the time?  Lodestar Solutions offers our Small Project Service.  Chipping away at your "To Build" list with our Small Project Services will get you results faster and cheaper than making a big production of the "Next Phase Major Build" that usually gets kicked down the road due to cost and time.  If you have additional questions about this service or require assistance immediately, please contact Lodestar at (813)-415-2910 or at

TM1 Websheets to Simplify, Document and Organize TI Process and Tasks


I’m all about easy. It is a phrase I repeat to my clients who are new to IBM Cognos TM1 to help them focus on the simplest solution rather than recreating a cumbersome process in new software. This is a guided example showing how to use TM1 websheets to organize TI Processes, serve as a tool to document TI Processes, and document TM1 tasks.

As a TM1 admin, you are pretty adept at toggling around cubes and understanding your data. Maybe you even built the whole model and it’s all second nature to you, so you skipped documenting procedures. Well… it all seemed second nature to you at the time, but a year has gone by and you need to prep for your upcoming budget cycle and now you are trying to remember what needs to be updated.

In any event, I want to explain how to simplify and manage tasks and documentation in TM1 with websheets. For this, I utilize Perspectives and the Applications folders of Server Explorer to streamline tasks. The Applications folders are great for grouping tasks with the use of links to documentation (i.e. Word documents), cube views set up for specific needs, and websheets that can take it all to the next level. By next level, I mean that you can create a websheet that provides data pulled from TM1, add in Excel formulas to do calculations on that data (thus eliminating the need to write additional rules in TM1), have an input area to update data points, include Action Buttons to run TI Processes, and insert notations to document the function. This will not only make your life easier, it will help you remember how the heck you did something last year, allow you to delegate tasks to other team members, and document procedures to give you peace of mind knowing you did not leave behind an unrewarding treasure hunt for your former co-workers to figure out how you did something after you won the lottery and left the company. You can feel free to relax and have that Pina Colada on a Tuesday while floating in the pool (note: only for lottery winners).

Here, I will walk through taking a cube that houses rates used in an annual budget cycle from just being a cube for input to a streamlined TM1 websheet that simplifies the procedure.

Step One: Create a cube for rate input and save a view specific for input.


What’s missing? Well, now you need to remember that your Seasonal_Factor that you must update is in the Rates cube. That’s not really evident when you first open Server Explorer.

Step Two: Use Applications folder area to organize tasks in TM1 Websheets


Getting closer. You dragged your Seasonal_Factor_Input view up to your Admin folder so that you know it is part of your Admin functions.

What’s missing? Working with just this view, I don’t see the average of the last two years to figure out what is trending in my data. Also, I will either need to put a HOLD on the Total Year so that if I make any changes I still balance to 100% for the year, or else use Data Spreading at the Total Year level to ensure I am at 100%. This is more effort than I want, and it is prone to the error of Total Year not equaling 100%.


Step Three: Create a websheet and put it in the Applications folder area.

Using Perspectives and the view I created earlier, I do a slice to Excel so that TM1 puts most of the formulas I need into the websheet I am creating. In another tab of the same websheet I add a different slice that pulls in dollar-value data from two other years that I select on the first tab. In the third tab, using just Excel functions, I calculate percentage values. I go back to my first tab, the one I will use for input, and again using just Excel functions, add a section that averages two years of data to give me a trend value. I use the SUBNM function for the Year drivers and layer in formulas to make their defaults dynamic, while still giving me the ability to select what years I want. I add two Action Buttons: one to set the values in my input area to my Rates cube and a second to run a TI Process that ensures the Total Year equals 100% for each property. Last, I upload my websheet to the Server Explorer Applications folder called Admin. This allows me to update my values and see trending in a simple websheet accessed via TM1 Web.
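For reference, the SUBNM worksheet function mentioned above takes a dimension (optionally prefixed with the server name), a subset, and an index or element name. A sketch for one of the Year drivers, where the server name budget and the dimension and subset names are illustrative:

=SUBNM("budget:Years", "All Years", 1)

This returns the first element of the All Years subset and gives the user a pick list to choose a different year.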

In TM1 Web, I click on my highlighted websheet.

This opens a websheet that looks like this:

In conclusion, websheets can handle a variety of tasks to make your planning and analysis procedures easier to manage and to include more robust functionality. In this example, we went from a cube view used to input Seasonal Factor values to a websheet that gives insight into trends, ease of data entry, and assurance of accuracy. I can customize the formatting to make it as pretty as I want. I can export it as a snapshot or PDF to send to others who may need this info but do not have access to TM1. I could even print it out and hang it on my refrigerator if magnets stuck to my doors.

Most Excel formulas and formatting are supported in Perspectives and TM1 Web. TM1 Web does render slightly differently from Perspectives, so you may need to make some adjustments to get the results you expect.

To learn more about TM1 Worksheet functions,  click here. 

Simplifying Maintenance of Cognos Planning Access Tables w/ D-Cubes


Are you a Cognos Planning client looking to simplify maintenance of your system? As a business analytics consultant specializing in Planning, I despise the maintenance of access tables. It seems like the process is still living in the 1970s. As a result, we are sharing with you today how to simplify maintenance of Cognos Planning Access Tables with D-Cubes. This technique allows you to create and update Cognos Planning Access Tables just by changing the data values in a Cognos Planning Analyst D-Cube. I believe this is way better than manually maintaining the Access Tables in Cognos Planning!

The Challenge:

First of all, managing Access Tables for large and highly distributed Cognos Planning models can be a labor-intensive and time-consuming process, especially when access rights change frequently. The Contributor Administration Console's Access Tables editor interface can be a royal pain.

The Solution:

Use an IBM Cognos Planning Analyst cube to manage access and update the Contributor Admin Access Tables. Once you create the Analyst D-Cube, you will be able to assign and maintain the data in the cube then export the cube data and import it into Contributor Admin. You can even put the process in a macro to automate it. This method is so much easier, more visual, and flexible. At least that’s my opinion. If you have simple access tables that don’t change, you might just want to use the Contributor Admin Console.

Setting Up Components in Cognos Planning Analyst

Step 1 – Create a D-List with 3 items: READ, WRITE, and HIDDEN as shown below:


Please note that you might also want to consider using NO DATA in your D-List.

Step 2 – Create the Access Table cube in IBM Cognos Planning – Analyst by choosing the dimensions on which access levels will be determined.


An example of the Access Table in Contributor Admin is shown above.

In this example, we will create a cube with 2 dimensions (AcctsIncStmt and the e.List). To format this cube for consumption by the Contributor Administration Console as an imported Access Table, a D-Cube format is applied. Make sure the D-Cube is open and active. From the Analyst drop-down menus choose D-Cube > Format. For the format type, choose D-List and select the D-List created in Step 1. This provides the options of READ, WRITE and HIDDEN (and NO DATA if you included it).


Above, the rows correspond to AcctsIncStmt and the columns correspond to the e.List. Each cell holds the access level (READ/WRITE/NO DATA/HIDDEN) because we applied the D-List format.

Step 3 – Populate the cells in the cube.

The Analyst D-Cube can be populated manually, automatically via D-Link, automatically via formula, or some combination of the aforementioned. Note that not every cell needs to be populated; a default access level can be specified when importing this Access Table into Contributor, and any blank values will be populated with the default access level, in this case NO DATA.

"So what's the next step in the process?"

Step 4 – Exporting the Access Table from IBM Cognos Planning - Analyst

Now that the Access Table D-Cube has been created and populated, the next step is to export the table from IBM Cognos Planning Analyst in a format that Cognos Planning Contributor Admin understands. To do this, open the Access Table D-Cube you created, select D-Cube from the Analyst main menu bar, and then choose Export. The selection should involve only the items needed for the Access Table. Choose the following options from the Export D-Cube settings:


a) Select Ascii File and then click the ellipsis to set the export file location.
b) In the Format box, select Tab as the separator.
c) For Format Column Headings, choose None.
d) In the Groups box, select Multiple Column.
e) In the Dimension Order box, ensure that the detail dimension is first, the e.List dimension is second (if applicable), and the data dimension is last. The dimension order MUST correspond to the order required by the Access Table.

To suppress items that have not been granted READ, WRITE, HIDDEN or NO DATA access levels, click on the Zeros tab and highlight the line that represents Rows. When the import is executed, the Base Level Access will be set to NO DATA and any untagged dimension items will be set to NO DATA in the Contributor access table.


Import The Exported File Into Contributor Admin

Now that you’ve created the Export file from the Analyst D-Cube defining the access levels, you will import the Access Table data into IBM Cognos Planning – Contributor. To import the tab-delimited file that was created above, open Cognos Planning – Contributor Administration Console, expand the Datastores, expand the application, expand Development, expand Access Tables and Selections, and choose Access Tables.

In Access Tables, select the correct dimension and cubes that the access table will be applied to, and select the Import access table radio button. If you are using the e.List in the access table, check the Include e.List checkbox.


Next, click the Add button. After the new row has been added, select it and click on the Import button. Click the ellipsis to select the file that was exported from IBM Cognos Planning – Analyst. Choose the Base Access Level that will be applied to the access table that is being imported. You might want to select NO DATA, which will result in NO DATA being applied to any items not defined in the import file. Also, ensure that Options – Import is checked. First row contains column headers should remain unchecked.


After these selections have been made, click OK and the import should run with no errors. This stages the updates to the access table, but before they are applied in Contributor you will need to run a GTP (Go to Production) process.

Note: The view of the Access Table is in a different order than the order required by the import file.


Automate It

Once you have set this process up, you can automate it with a macro. By leveraging an Analyst D-Cube to define the access table, you will simplify maintenance. This is recommended when you have complex access tables, when they change often, or when maintaining them manually just seems inefficient.

In conclusion, in a few short steps you can create a cube to define and maintain access tables in Analyst, export the definitions, and import them into the Contributor Administration Console. For more information on Cognos Planning, check out the IBM Knowledge Center for the latest product documentation and user guides.

There's Got To Be A Better Way...

Let’s now talk about how challenging it can be to define security in Cognos Planning. The maintenance time of just waiting for GTP processes to run can be frustrating. But there is an alternative. You know IBM has a more powerful planning and budgeting solution called TM1 (Performance Management). The beauty of TM1 is that assigning security, even down to the cell level, is much easier. You can make changes in your model without even having to do a sync or GTP process!

Here at Lodestar Solutions, we feel the future of Cognos Planning is in question! So, we encourage you to start considering and defining your migration to a better tool. Check out our blog on the benefits of TM1 over Cognos Planning to learn more information.

Lodestar Solutions wants you to be educated, so we created a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library of videos and templates and there is NO COST TO YOU. Finally, if you have specific questions, please contact us at and one of our analytics coaches will get right back to you.

For more tips and tricks regarding Cognos Planning...

If you learned something from this blog, we encourage you to refer back to the previous tips in this series.

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding the re-ordering your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Check out Tip #6 on how to zero out data in Planning Contributor, CLICK THIS LINK.

And for our final Tip #7 on making an analyst library copy in Cognos Planning, visit our previous blog by CLICKING THIS LINK.

How to Zero Out all of the Data in Cognos Planning!


Do you need to reset a cube to zero in order to clear out all the data in Contributor so users can start over? Maybe you are updating processes and require a reset of your cubes in Contributor? Below, we will share how to zero out all of the data in Cognos Planning, on a per-cube basis only.

It’s an all-or-nothing option: every cell within a cube will be set to zero. Please note that the option to zero out only a selection of the cube, such as a single version, does not exist. However, you can run an Analyst-to-Contributor link to zero out a selection of the Cognos Planning cube. For more information on zeroing out a selection, check out this IBM link on how to zero out Contributor D-Cube data on the web. Before you run this process, please ensure your users are OK with a complete reset of the cube to zero.

Steps to Zero Out Data in Cognos Planning Contributor

The first step in zeroing out all of the data in Cognos Planning Contributor is to go to Cognos Planning Contributor Administration. Next, select the appropriate application as noted below. Make sure you have the right application! Then, inside the development folder, go to the import section and select the prepare tab. Next, check the zero data box next to the appropriate cube and click the prepare button at the bottom. These steps are noted below in the screen shot.


After you click prepare, the following message will appear.

If you then click on the Job Management folder for the application, you will see the Prepare_Import job running. Once it completes, you will need to run a GTP (Go to Production) process by clicking the green arrow at the top. Please note that if you forget to run the GTP process, the data will not be zeroed out until you do.


A situation may arise where you load data via a Cognos Planning Analyst-to-Contributor link and then, after running the link, realize you didn't want to process the data all the way to Contributor. Don't worry. You can zero out the data queue. This will delete the staged data before you send it to Production. A couple of important notes about this process: it should NOT be used in conjunction with the aforementioned Zero Data in a Contributor Model procedure, because doing so would delete the import queue created by the previous Prepare_Import job and the data would not zero.

Steps to Zero Out the Data Queue

The first step is to go to Cognos Planning and then the Contributor Administration area. Then select the appropriate application as noted below. Next, go to the development folder and select the import data area. Select the Prepared data blocks tab and choose Delete import queue. This will delete the staged or prepared data blocks. Also note that a Go to Production process is not required for this action. If you need to, you can fix the data in Cognos Planning Analyst and rerun the Analyst-to-Contributor link with the good data.


Hopefully this helps you understand how to zero out all of the data in Cognos Planning! My recommendation when you do this type of maintenance in Cognos Planning is to always FOCUS! Shut down your email and turn off your phone! I learned that lesson the hard way.

Some food for thought...

If you really want to simplify the maintenance of your budgeting and forecasting models, you should absolutely consider looking at TM1. At Lodestar Solutions, we feel the future of Cognos Planning is in question! Hence, we encourage you to start considering and defining your migration to a better tool. Check out our blog to learn the benefits of TM1 over Cognos Planning.

Lodestar Solutions wants you to be educated, so we are creating a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library and don’t forget that there is NO COST TO YOU for this program.

If you have specific questions, please contact us at and one of our analytics coaches will contact you.

If you learned something from this blog, we encourage you to refer back to the previous tips in this series....

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding the re-ordering of your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Are Your IBM Security Certificates About To Expire?


The purpose of SSL (Secure Sockets Layer) certificates is to encrypt data as it is transferred around a network. These SSL certificates also have an expiration date. If the IBM security certificates expire, you may run into issues using certain client-side Cognos applications or gaining access to your data and system.

There are many good reasons for keeping your Cognos BI and TM1 systems up to date on fix packs. One of the most important is that it will keep the security (SSL) certificates updated or renewed! (Please note that the most recent fix packs do not have the SSL updates yet. IBM should have updated fix packs soon, and we will update this post when they are available.)

Below, you will find some easy tips for determining when your certificates expire and how to check what version of TM1 or BI (with fix packs) you are operating.

Tips For Determining Expirations & Operating Versions

First, let’s start with the instructions to check on your expiration dates for the IBM security certificates. The summary instructions can be found by using this IBM Support Link.

Next, you will want to determine what version and fix packs of the Cognos software you are running by following this easy process. Keep in mind that you must be current on support in order to have access to fix packs. If you let this lapse or need assistance, please contact so that we may assist you.

Cognos BI

In regards to Cognos BI, you will need the information found using this IBM support link to compare to your settings.

Open Cognos configuration for BI, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed to the chart.


Cognos TM1

Cognos TM1 is very similar.  We suggest using this IBM support link to determine the version you are currently using.

Open Cognos configuration for TM1, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with PLANANALYTICS to the chart.


Cognos Express

Finally, for determining Cognos Express, open Cognos configuration for Cognos Express, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with the charts above for BI and TM1. While there is not a specific chart for Cognos Express, the numbers should correlate.

Furthermore, as a general rule for Cognos BI, TM1 or Express, we recommend being on the most recent fix pack for the version you are running. Ideally, you want to be on the latest version and latest fix packs. For Cognos BI, TM1 and Express, that should be 10.2.2 unless you have upgraded to IBM Cognos Analytics (Cognos 11). The current version of Cognos 11 is 11.0.3, with release 4 coming soon. Additional fix pack information can be found by visiting the following IBM Fix Central link.

Additional Fix Pack Information

There are also a few other things you can do besides making sure you are on the latest fix packs:

  1. Disable SSL in tm1s.cfg. This is not recommended because it will leave your traffic unprotected by the SSL certificate.
  2. Generate your own IBM security certificate and then apply it.
  3. If you are using TM1 10.2.2, upgrade to a 2048 bit SSL as discussed on this IBM Support FAQ link.
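For reference, option 1 above corresponds to a single parameter in the server's tm1s.cfg file. A sketch of the relevant lines (again, not recommended for production, since traffic would travel unencrypted):

[TM1S]
# Turn off SSL for this TM1 server instance (not recommended)
UseSSL=F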

Should you need assistance working through these instructions or any other questions regarding IBM security certificates, call Lodestar Solutions at 813-254-2040. If necessary, we will involve one of our technical consultants. We can also install the fix packs to ensure that your systems are up to date with the latest enhancements and updates!

UPDATE – IBM has announced that the update fix for this issue (including some enhancements) will be released at the end of October or early November 2016.