New Hierarchy Level Behavior in Planning Analytics (TM1) Set Editor


Leaf Element Level naming convention is really changing...

It was when I first started teaching the Performance Modeler (PM) class, showing students how to create a TI Process with the Guided Import, that I noticed the leaf level in PM was not Zero.  The highest consolidated level was Zero.  At the time I assumed it was a fluke, that PM had simply been designed differently and the rest of TM1 would stick with the numbering convention Architect used.  Wrong!

You have to be a 10 to be number one.  Huh???

Fast forward and I'm now working with Planning Analytics Workspace and PAx.  Like many consultants, we learned TM1 using Architect.  In Architect, level Zero was synonymous with the leaf-level element of a dimension, and we all know that TM1 holds numeric data only at the leaf level.  We've gotten very accustomed to using the terms "leaf", "n-level" and "Zero-level" interchangeably.  In fact, my first dynamic subset was created using the Filter by Level option and choosing "0" for the leaf elements, and I even named the subset "Level_0".

Being Level Zero is tops!

Historically TM1 has labeled its hierarchy levels where level 0 is the leaf level and the highest number is the top level.

This behavior persists today in Architect and Perspectives. With the new set editor released in the second version of Planning Analytics, however, the set editor follows industry-standard MDX practice: level 0 is the highest level and the highest number is the leaf level. In other words, the reverse of what it was previously.

End users should be educated about this change to avoid confusion when working with the data.

This issue is best highlighted when editing a set. First let’s look at Architect. In this example the plan_time dimension is opened and all elements are shown.

If we filter by levels as shown below, the 0 level will be the leaf level.

And if we filter on level 2, in this case, it will return the highest level.

Conversely in the Planning Analytics Set editor in Planning Analytics Workspace or Planning Analytics for Excel, the reverse filtering method is true and is in alignment with standard MDX queries. In this example, we are filtering on level000 which returns the top level as seen below.

And if we filter on level002, it returns the leaf level as seen below.

The set editor also provides a convenient way to select leaf elements directly via the Leaf Level option in the filter.

Before we get into some solutions to simplify the end user experience, remember that if users are proficient in MDX syntax including TM1-specific MDX functions, they can always quickly build their expression-based subsets by editing the MDX in the Set Editor.

For example, the following standard MDX expression would return all leaf level elements in our example:
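One way to write it (a sketch only; this assumes the plan_time dimension has three levels, numbered 0 through 2 top-down per the new convention, and exact syntax can vary slightly by version):

{ [plan_time].Levels(2).Members }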


If you wanted to return the top level elements, you could use the following expression:
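A sketch under the same three-level assumption, where level 0 is now the top:

{ [plan_time].Levels(0).Members }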


Again, where 0 is the highest level and 2 is the lowest level in the example.

Conversely, you could use TM1-specific MDX functions to create the old behavior in the set editor’s MDX expression by using the following syntax which returns the leaf level:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 0)}

Or, the following to return the highest level in our example:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 2)}

Where 0 is the leaf level and 2 is the top level.

Methods to Simplify the User Experience

Create Named Levels

One method is to give your hierarchy levels meaningful names for your end users.

To do so:

In Architect, ensure you have enabled Display Control Objects under the View menu.

Find and open the }HierarchyProperties cube and then from the Dimensions drop down list, select the dimension you would like to override the naming convention for.

In this case we will select plan_time and provide the following level names.
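The level names live in the }HierarchyProperties control cube, and they can either be typed into the cube view directly or set from a TI process with CellPutS. The snippet below is illustrative only: it assumes level001 is the topmost level slot in the cube and that Year/Quarter/Month are the names we want for plan_time.

CellPutS('Year',    '}HierarchyProperties', 'plan_time', 'level001');
CellPutS('Quarter', '}HierarchyProperties', 'plan_time', 'level002');
CellPutS('Month',   '}HierarchyProperties', 'plan_time', 'level003');

Whichever way you enter the values, the hierarchy still needs to be refreshed (as described in the following steps) before the names appear.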

Save the View as Default and not Private.

Now the plan_time dimension’s hierarchy needs to be updated. To do so, we will run a TI process. Right-click Processes in the left pane and click Create New Process.

Click the Advanced tab and then the Prolog tab.

After the Generated Statements section, enter the following syntax, replacing the dimension name with the dimension you want to update:

RefreshMdxHierarchy('plan_time');

Save the Process with an appropriate name and then click on Run to execute the TI process.

Now in the Planning Analytics Set Editor, the level names will appear in a more meaningful way.

Dealing with Ragged or Unbalanced Hierarchies

In some cases, a hierarchy may present elements at the same level as your consolidated levels that you may not want to display in your results when working in the Set editor.

In our example, we can see this with the Description element being displayed when filtering on the Year level.

A simple way to make the user experience more focused is by creating appropriate subsets.

In this case you could create and save a Public subset for the dimension that displays only the Years; in this example, a subset called All Years.

With this selection, users have a focused list of elements to work with, which they can also expand, and there is no need for the extra clicks involved in filtering.

Notice the Description element is no longer present.

So the moral of this story is that you may as well get on board with the new naming convention and start naming your subsets according to what "Level Zero" means going forward, rather than keeping it as a legacy term.  This will also keep the new set editor consistent with the way Performance Modeler treats levels in the guided import process for creating TI Processes.

Get more out of your Planning Analytics/TM1 solution faster

Need to build more analytics into your model, but don't have the time?  Lodestar Solutions offers our Small Project Service.  Chipping away at your "To Build" list with our Small Project Services will get you results faster and cheaper than making a big production out of the "Next Phase Major Build" that usually gets kicked down the line due to cost and time.  If you have additional questions about this service or require assistance immediately, please contact Lodestar at (813)-415-2910 or at

TM1 Websheets to Simplify, Document and Organize TI Process and Tasks


I’m all about easy. It is a phrase I repeat to my clients that are new to IBM Cognos TM1 to help them focus on the simplest solution rather than recreating a cumbersome process in new software. This is a guided example showing how to use TM1 websheets to organize TI Processes, document TI Processes, and document TM1 tasks.

As a TM1 admin, you are pretty adept at toggling around cubes and understanding your data. Maybe you even built the whole model and it’s all second nature to you, so you skipped documenting procedures. Well… it all seemed second nature to you at the time, but a year has gone by and you need to prep for your upcoming budget cycle and now you are trying to remember what needs to be updated.

In any event, I want to explain how to simplify and manage tasks and documentation in TM1 with websheets. For this, I utilize Perspectives and the Applications folders of Server Explorer to streamline tasks. The Applications folders are great for grouping tasks with the use of links to documentation (e.g. Word documents), cube views set up for specific needs, and websheets that can take it all to the next level.

By next level, I mean that you can create a websheet that provides data pulled from TM1, add in Excel formulas to do calculations on that data (thus eliminating the writing of additional rules in TM1), have an input area to update data points, include Action Buttons to run TI Processes, and insert notations to document the function.

This will not only make your life easier, it will help you remember how the heck you did something last year, allow you to delegate tasks to other team members, and document procedures to give you peace of mind knowing that you did not leave behind an unrewarding treasure hunt for your former co-workers to figure out how you did something after you won the lottery and left the company. You can feel free to relax and have that Pina Colada on a Tuesday while floating in the pool (note, only for lottery winners).

Here, I will walk through taking a cube that houses rates used in an annual budget cycle from a plain input cube to a streamlined TM1 websheet that simplifies the procedure.

Step One: Create cube for rate input and save view specific for input in TM1 Websheets.


What’s missing? Well, now you need to remember that your Seasonal_Factor that you must update is in the Rates cube. That’s not really evident when you first open Server Explorer.

Step Two: Use Applications folder area to organize tasks in TM1 Websheets


Getting closer. You dragged your Seasonal_Factor_Input view up to your Admin folder so that you know it is part of your Admin functions.

What’s missing? Working with just this view, I don’t see the average of the last two years to figure out what is trending in my data. Also, I will either need to put a HOLD on the Total Year so that any changes still balance to 100% for the year, or use Data Spreading at the Total Year level to ensure I am at 100%. This is more effort than I want, and it is prone to the error of Total Year not equaling 100%.


Step Three: Create a websheet and put it in the Applications folder area in TM1 Websheets

Using Perspectives and the view I created earlier, I do a slice to Excel so that TM1 can put most of the formulas I need into the websheet I create. In another tab of the same websheet I add a different slice that pulls in dollar value data from two other years that I select on the first tab. In the third tab, using just Excel functions, I calculate percentage values. I go back to my first tab, which I will use for input, and again using just Excel functions, add a section that averages two years of data to give me a trend value.

I use the SUBNM function for the Year drivers and layer in formulas to make their defaults dynamic, while still giving me the ability to select what years I want. I add two Action Buttons: one to set my values in the input area of my Rates cube, and a second to run a TI Process that ensures the Total Year equals 100% for each property. Last, I upload my websheet to the Server Explorer Applications folder called Admin. This allows me to update my values and see trending in a simple websheet accessed via TM1 Web.
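As an illustration of those Year drivers, a SUBNM worksheet function pulls a pick-list from a subset. The server, dimension, and subset names below are hypothetical placeholders for whatever your model actually uses:

=SUBNM("myserver:Year", "All Years", 1)

This displays the first element of the subset and lets the user double-click the cell to pick a different year, which the other formulas on the sheet can then reference.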

In TM1 Web, I click on my highlighted websheet.

This opens a websheet that looks like this:

In conclusion, websheets can handle a variety of tasks to make your planning and analysis procedures easier to manage and to add more robust functionality. In this example, we went from a cube view used to input Seasonal Factor values to a websheet that gives insight into trends, ease of data entry, and assurance of accuracy. I can customize the formatting to make it as pretty as I want. I can export it as a snapshot or PDF to send to others who may need this info but do not have access to TM1. I could even print it out and hang it on my refrigerator, if magnets stuck to my doors.

Most Excel formulas and formatting are supported in Perspectives and TM1 Web. TM1 Web does render slightly differently from Perspectives, so you may need to make some adjustments to get the results you expect.

To learn more about TM1 Worksheet functions,  click here. 

Simplifying Maintenance of Cognos Planning Access Tables w/ D-Cubes


Are you a Cognos Planning client looking to simplify maintenance of your system? As a business analytics consultant specializing in Planning, I despise the maintenance of access tables. It seems like the process is still living in the 1970s. So today we are sharing how to simplify maintenance of Cognos Planning Access Tables with D-Cubes. This technique allows you to create and update Cognos Planning Access Tables by just changing the data values in a Cognos Planning Analyst D-Cube. I believe this is way better than manually maintaining the Access Tables in Cognos Planning!

The Challenge:

First of all, managing Access Tables for large and highly distributed Cognos Planning models can be a labor-intensive and time-consuming process, especially when access rights change frequently. The Access Tables editor in the Contributor Administration Console can be a royal pain.

The Solution:

Use an IBM Cognos Planning Analyst cube to manage access and update the Contributor Admin Access Tables. Once you create the Analyst D-Cube, you will be able to assign and maintain the data in the cube then export the cube data and import it into Contributor Admin. You can even put the process in a macro to automate it. This method is so much easier, more visual, and flexible. At least that’s my opinion. If you have simple access tables that don’t change, you might just want to use the Contributor Admin Console.

Setting Up Components in Cognos Planning Analyst

Step 1 – Create a D-List with 3 items: READ, WRITE, and HIDDEN as shown below:


Please note that you might also want to consider using NO DATA in your D-List.

Step 2 – Create the Access Table cube in IBM Cognos Planning – Analyst by choosing the dimensions on which access levels will be determined.


An example of the Access Table in Contributor Admin is shown above.

In this example, we will create a cube with 2 dimensions (AcctsIncStmt and the e.List). To format this cube for consumption by the Contributor Administration Console as an imported Access Table, a D-Cube format is applied. Make sure the D-Cube is open and active. From the Analyst drop-down menus choose D-Cube > Format. For the format type, choose D-List and select the D-List created in Step 1. This provides the options Read, Write and Hidden (No Data).


Above, the rows correspond to AcctsIncStmt and the columns correspond to the e.List. The page dimension holds the access level [READ/WRITE/NODATA/HIDE] because we applied the D-List format.

Step 3 – Populate the cells in the cube.

The Analyst D-Cube can be populated manually, automatically via D-Link, automatically via formula, or some combination of the aforementioned. Note that not every cell needs to be populated; a default access level can be specified when importing this Access Table into Contributor. Any blank values will be populated with the default access level, in this case NO DATA.

"So what's the next step in the process?"

Step 4 – Exporting the Access Table from IBM Cognos Planning - Analyst

Now that the Access Table D-Cube has been created and populated, the next step is to export the table from IBM Cognos Planning Analyst in a format that Cognos Planning Contributor Admin understands. With the Access Table D-Cube open and active, select D-Cube from the Analyst main menu bar and choose Export. Ensure the selection involves only the items needed for the Access Table. Choose the following options from the Export D-Cube settings:


a) Select Ascii File and then click the ellipsis to set the export file location.

b) In the Format box, select Tab as the separator.

c) For Format Column Headings, select None.

d) In the Groups box, select Multiple Column.

e) In the Dimension Order box, ensure that the detail dimension is first, the e.List dimension is second (if applicable), and the data dimension is last. The dimension order MUST correspond to the order required by the Access Table.

To suppress items that have not been granted READ, WRITE, HIDDEN, or NO DATA access levels, click on the Zeros tab and highlight the line that represents Rows. When the import is executed, the Base Level Access will be set to NO DATA and any untagged dimension items will be set to NO DATA in the Contributor access table.


Import The Exported File Into Contributor Admin

Now that you’ve created the export file from the Analyst D-Cube defining the access levels, you will import the Access Table data into IBM Cognos Planning – Contributor. To import the tab-delimited file created above, open the Cognos Planning – Contributor Administration Console, expand Datastores, expand the application, expand Development, expand Access Tables and Selections, and choose Access Tables.

In Access Tables, select the correct dimension and cubes that the access table will be applied to and select the Import access table radio button. If you are using the eList in the access table, check the Include e.List checkbox.


Next, click the Add button. After the new row has been added, select it and click the Import button. Click the ellipsis to select the file that was exported from IBM Cognos Planning – Analyst. Choose the Base Access Level that will be applied to the access table being imported. You might want to select NO DATA, which results in NO DATA being applied to any items not defined in the import file. Also, ensure that Options – Import is checked. First row contains column headers should remain unchecked.


After these selections have been made, click OK and the import should run with no errors. This stages the updates to the access table, but before they are applied in Contributor you will need to run a GTP (Go To Production) process.

Note: The view of the Access Table is in a different order than the order required by the import file.


Automate It

Once you set this process up, you can automate it with a macro. By leveraging an Analyst D-Cube to define the access table, you will simplify the maintenance. This is recommended when you have complex access tables, when they change often, or when maintaining them manually is inefficient.

In conclusion, in a few short steps you can create a cube to define and maintain access tables in Analyst, export the definitions, and import them into the Contributor Administration Console. For more information on Cognos Planning, check out the IBM Knowledge Center for the latest product documentation and user guides.

There's Got To Be A Better Way...

Let’s now talk about how challenging it can be to define security in Cognos Planning. Just waiting for the GTP processes to run can be frustrating. But there is an alternative. IBM has a more powerful planning and budgeting solution called TM1 (Performance Management). The beauty of TM1 is that assigning security, even down to the cell level, is much easier. You can make changes in your model without even having to do a sync or GTP process!

Here at Lodestar Solutions, we feel the future of Cognos Planning is in question! So, we encourage you to start considering and defining your migration to a better tool. Check out our blog on the benefits of TM1 over Cognos Planning to learn more information.

Lodestar Solutions wants you to be educated, so we created a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library of videos and templates and there is NO COST TO YOU. Finally, if you have specific questions, please contact us at and one of our analytics coaches will get right back to you.

For more tips and tricks regarding Cognos Planning...

If you learned something from this blog, we encourage you to refer back to the previous tips in this series.

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding re-ordering your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Check out Tip #6 on how to zero out data in Planning Contributor, CLICK THIS LINK.

And for our final Tip #7 on making an analyst library copy in Cognos Planning, visit our previous blog by CLICKING THIS LINK.

How to Copy and Archive Cognos Planning Models


That time of the year is approaching when you either need to roll over your budget or create new Cognos Planning models. This requires you to make a Cognos Planning Analyst library copy. Some clients also ask, “Can you remind me how to copy and archive Cognos Planning models?” Before we start with instructions and tips for creating an Analyst library copy, let’s talk about a few key points.

This approach may or may not be useful for Production libraries. Deleting the objects will also delete the data. However, not deleting the objects could result in objects remaining in the Production library that are no longer needed or used. Depending on the timing, it may be better to create a new production library into which the Development objects are copied. The prior Production Cognos Planning Models library could serve as an archive or historical library.

D-Links, Reports, and Macros...

Inter-library D-Links (source and target in different libraries), reports (linkage between more than one library), and macros (referenced objects residing in more than one library) will require additional steps. D-Lists that reside in a library other than where the associated D-Cubes reside will also require additional steps. However, at this point in time, the only inter-library D-Cube D-List references are to the D-Lists found in Dev Common.

If unsure of the contents of a library, it is best to check its integrity. Checking the integrity of the library will alert you to objects in other libraries with which the current library shares references.

To Check the Integrity of Objects in Cognos Planning Models...

In Analyst, go to File/Library/Objects and select the appropriate library. Select the appropriate object type; in most cases, All should be selected. It is also possible to check the integrity of specific objects. In that instance, move only the object(s) against which an integrity check needs to be run.

Move all objects to the bottom half of the screen via the down arrow icon in the middle of the screen.


Select the highlighted exclamation point in the middle of the screen to check the integrity. Depending on the size of the library, this process could take a minute or two to complete.


The following screen will appear. Review the results, especially for unexpected references, e.g. a D-Link saved to the source library instead of the target library. It is usually most helpful to look at the Probable and Suspect references. Required can also be helpful, but keep in mind that if a D-Cube is required, it will also list all the D-Lists and even D-List formats required.

A determination needs to be made for any references to other libraries: should the reference be broken/severed, copied or moved as is, copied to a new library so that the referenced item(s) are updated, or modified in some other manner?


To Delete Objects From Library - Cognos Planning Models:

File/Library/Objects. Select the appropriate library. Because of the referential integrity of the system, it is usually necessary to delete objects in order:

  1. Reports

  2. Macros

  3. D-Links

  4. A-Tables

  5. Save Selections

  6. D-Cubes

  7. D-Lists

  8. File Maps (as they can be used to update d-lists)

  9. Formats

Next, select the object type and move the objects to the bottom half of the screen via the down arrow icon in the middle of the screen. It is possible to delete all objects from a library at the same time by moving them to the bottom half of the screen in the order above. However, the “All Objects” method works best with self-contained, uncomplicated libraries.


Select the Red X in the middle of the screen to delete the objects.

If the Object cannot be deleted due to references to other objects, a message similar to the one below will be returned:


The reference(s) must be broken before the object can be deleted. (In some instances the referenced object can be deleted at the same time.)

If the object can be deleted, the following message will be returned. Select Yes to complete the deletion.


Follow the steps above until all objects have been deleted.

To Copy Objects - Cognos Planning Models:

File/Library/Objects and select the Appropriate Library. Select the Object Type. All Objects or Specific Objects can be selected.

To aid in filtering out objects that should not be copied, most notably the “z’d” objects, click on the filter icon (funnel with an =) at the top of the screen.


Under the Name drop-down, select either <> or =.  As noted, * is the wildcard, so <> z* will return all objects that do not begin with z. (It is possible to use *criteria* if the string may be found anywhere within the object’s name.) Select OK.

All items not beginning with a z are returned.

Move the objects to the bottom half of screen via the down arrow icon in middle of screen.


Select the Copy icon from the middle of the screen.


Select the Target Library (the library to which the objects will be copied) and ensure the BETWEEN and FROM boxes are checked.

Ok....what's next?

If D-Cubes already exist in the target library and the data in those D-Cubes should not be overwritten, also select Do not overwrite existing D-Cube data. The structure (the D-Lists comprising the D-Cube, the D-Cube itself, and the D-Links referencing it) will be updated with the results of the copy, but the underlying data will not be copied over. Data integrity may be lost if the underlying structure of the new cube differs from the existing cube, e.g. new calculations or items no longer on a D-List.


If objects with the same name already exist in the target library, the following warning will be returned. Note that the name of the object must be identical: if a D-List originally named fund is renamed funds, Analyst will not recognize the rename during the copy and both objects will exist in the target library.

Answering Yes will overwrite the existing objects with the new objects. Answering No will stop the copy process.

If target items should not be replaced, the items need to be removed from the source copy. Select Cancel from the copy window.


Remove the objects from the bottom half of the selection screen by using the up arrow.


Once the list of objects to be copied has been modified, proceed as before. Once the initial copy has been completed, the newly copied D-Links need to have their source modified to reflect the same environment, i.e. training, development, or production.

To Modify a D-Link's Source

Using the File/Library/Object* method (see previous), select the D-links to be copied.

  • *File/Library/D-Links can also be used in this instance.

Note: Any D-links with a source existing in another library are prefixed with i*_. Integrated Expense D-links have a prefix of ie_. The filter (see previous) can be used to select the relevant D-links.

The Copy default is the BETWEEN option. Leave the default [BETWEEN] and select the original source library. If there is more than one source library for the set of D-links, each must be handled separately.


Once the D-Links are copied to the original source library, select them [from the original source library] and copy them to the corresponding source library in the new environment e.g. from Dev TA Revenue to Training TA Revenue.

Ensure both BETWEEN and FROM are selected.


Once the D-Links are copied to the new source library, select them [from the new source library].

Select the move icon from the middle of the window.


Move them to the corresponding target library in the new environment e.g. from Training TA Revenue to Training Integrated Expense.

What's the next step?

Delete the D-Links from the original source library (the library to which the D-Links were copied in Step 1). Step 1 can be done via Move instead of Copy; Steps 2 through 4 remain the same regardless. Steps 1 through 4 can also be used to modify reports with links to other libraries and macros with objects in other libraries.

If a D-cube has D-lists residing in other libraries and the D-list references need to be modified, copy the D-cube using only BETWEEN objects, to the library in which the D-lists reside (Step 1). Copy the D-cube and the associated D-list(s) to the new D-list source library (Step 2) using BETWEEN and FROM. Move the D-cube back to the ‘D-cube starting’ library (Step 3). Delete the D-cube only from the D-list’s original source library (Step 4).

If self-contained libraries are desired, use the Copy Wizard. File/Library/Copy.

If additional information is needed before deleting an object, determine which objects are used by the object. Use the Show Using icon.


If additional references are needed, select the down arrow or level 1 through 4.


An alternative to using the Show Using option to determine the target and source of a D-link is to select the D-links and move them to the bottom of the window. This can be used to quickly determine if the aforementioned process was successful.

Finally, and to save paper, select print preview and view the source and target of the D-links on the screen.


If you are looking for more reference material check out the IBM Cognos Planning documentation on IBM’s site.

So WHY this need to copy libraries so often? Is there a better solution?

Are you running out of room in your Cognos Planning models? You can’t even add one more year to the model without blowing it up? Maybe you have too many sales people and just need to break your sales model into two separate applications? Well, the writing is on the wall! You have outgrown Cognos Planning!

The good news, however, is that IBM has an even better solution for you! IBM Cognos Performance Management, more commonly known as TM1, handles MUCH bigger models! It can hold up to 256 dimensions in a single cube.

We encourage you to start considering and defining your migration to a better tool. We suggest hopping over to our blog to learn the benefits of TM1 over Cognos Planning.

A 24/7 Video Library At No Cost To You....

Lodestar Solutions is also creating a library of videos and templates that will help you evaluate TM1. Check out our Move to TM1 Program where there is NO COST TO YOU! If you do, however, have specific questions, please contact us at and one of our analytics coaches will reach out to you.

Lodestar Solutions is working to educate readers with our blogs so we encourage you to refer back to the previous tips in this series....

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding re-ordering your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

And for last week's Tip #6 on how to zero out data in Planning Contributor, CLICK THIS LINK.

How to Zero Out all of the Data in Cognos Planning!


Do you need to reset a cube to zero in order to clear out all the data in Contributor so users can start over? Maybe you are updating processes and you require a reset of your cubes in Contributor? Below, we will share how to zero out all of the data in Cognos Planning on a per-cube basis.

It’s an all-or-nothing option – every cell within the cube will be set to zero. Please note that the option to zero out only a selection of the cube, such as a single version, does not exist. However, you can run an Analyst to Contributor link to zero out a selection of the Cognos Planning cube. For more information on zeroing out a selection, check out this IBM link on how to zero out Contributor d-cube data on the web. Before you run this process, please ensure your users are OK with a complete reset of the cube to zero.

Steps to Zero Out Data in Cognos Planning Contributor

The first step in zeroing out all of the data in Cognos Planning Contributor is to go to Cognos Planning Contributor Administration. Next, select the appropriate application as noted below. Make sure you have the right application! Then, inside the development folder, go to the import section and select the Prepare tab. Next, check the Zero Data box next to the appropriate cube and click the Prepare button at the bottom. These steps are noted in the screenshot below.

Cognos Planning

After you click prepare, the following message will appear.

If you then click on the Job Management folder for the application, you will see the Prepare_Import job running. Once it completes, you will need to run a GTP (Go To Production) process by clicking the green arrow at the top. Please note that if you forget to run the GTP process, the data will not be zeroed out until you do.


A situation may also arise where you load data via a Cognos Planning Analyst to Contributor link and then, after running the link, realize you didn't want to process the data all the way to Contributor. Don't worry. You can zero out the data queue. This will delete the staged data before you send it to Production. One important note about this process: it should NOT be used in conjunction with the aforementioned Zero Data in a Contributor Model procedure, because doing so would delete the import queue created by the previous Prepare_Import job and the data would not zero out.

Steps to Zero Out the Data Queue

The first step is to go to Cognos Planning and then the Contributor Administration area. Then, select the appropriate application as noted below. Next, go to the development folder and select the Import Data area. Select the Prepared Data Blocks tab and choose Delete Import Queue. This will delete the staged or prepared data blocks. Also note that a Go To Production process is not required for this action. If you need to, you can fix the data in Cognos Planning Analyst and rerun the Analyst to Contributor link with the good data.

Cognos Planning

Hopefully this helps you understand how to zero out all of the data in Cognos Planning! My recommendation when you do this type of maintenance in Cognos Planning is to always FOCUS! Shut down your email and turn off your phone! I learned that lesson the hard way.

Some food for thought...

If you really want to simplify the maintenance of your budgeting and forecasting models, you should absolutely consider looking at TM1. At Lodestar Solutions, we feel the future of Cognos Planning is in question! Hence, we encourage you to start considering and defining your migration to a better tool. Check out our blog to learn the benefits of TM1 over Cognos Planning.

Lodestar Solutions wants you to be educated, so we are creating a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library, and don’t forget that there is NO COST TO YOU for this program.

If you have specific questions, please contact us, and one of our analytics coaches will reach out to you.

If you learned something from this blog, we encourage you to refer back to the previous tips in this series....

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip #3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding the re-ordering of your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Are Your IBM Security Certificates About To Expire?


SSL (Secure Sockets Layer) certificates encrypt data as it is transferred around a network. These certificates also have an expiration date. If your IBM security certificates expire, you may run into issues using certain client-side Cognos applications or gaining access to your data and system.

There are many good reasons for keeping your Cognos BI and TM1 systems up to date on fix packs. One of the most important is that it will keep the security (SSL) certificates updated or renewed! (Please note that the most recent fix packs do not have the SSL updates yet. IBM should have updated fix packs soon and we will update when available.)

Below, you will find some easy tips for determining when your certificates expire and how to check what version of TM1 or BI (with fix packs) you are operating.

Tips For Determining Expirations & Operating Versions

First, let’s start with the instructions to check on your expiration dates for the IBM security certificates. The summary instructions can be found by using this IBM Support Link.
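IBM's step-by-step instructions live behind the support link above. As a generic illustration only (this is not IBM's documented procedure, and the file names below are placeholders), the expiration date of any PEM-format certificate can be inspected with OpenSSL:

```shell
# Create a throwaway self-signed certificate so the check below has
# something to inspect (illustration only -- point the second command
# at your real certificate file instead).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout demo_key.pem -out demo_cert.pem -days 365 2>/dev/null

# Print the certificate's expiration (notAfter) date.
openssl x509 -enddate -noout -in demo_cert.pem
```

The exact keystore location for your Cognos BI or TM1 certificates varies by install and version, so use the IBM support instructions to locate the right file before checking it.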

Next, you will want to determine what version and fix packs of the Cognos software you are running by following this easy process. Keep in mind that you must be current on support in order to have access to fix packs. If you let this lapse or need assistance, please contact us so that we may assist you.

Cognos BI

For Cognos BI, you will need the information found using this IBM support link to compare to your settings.

Open Cognos configuration for BI, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed to the chart.

IBM Security Certificates

Cognos TM1

Cognos TM1 is very similar.  We suggest using this IBM support link to determine the version you are currently using.

Open Cognos configuration for TM1, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with PLANANALYTICS to the chart.

IBM Security Certificates

Cognos Express

Finally, for determining Cognos Express, open Cognos configuration for Cognos Express, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with the charts above for BI and TM1. While there is not a specific chart for Cognos Express, the numbers should correlate.

Furthermore, as a general rule for Cognos BI, TM1 or Express, we recommend being on the most recent fix pack for the version you are running. Ideally, you want to be on the latest version and latest fix packs. For Cognos BI, TM1 and Express, that should be 10.2.2 unless you have upgraded to IBM Cognos Analytics (Cognos 11). The current version of Cognos 11 is 11.0.3, with release 4 coming soon. Additional fix pack information can be found by visiting the following IBM Fix Central link.

Additional Fix Pack Information

There are also a few other things you can do besides making sure you are on the latest fix packs:

  1. Disable SSL in tm1s.cfg. This is not recommended because it will leave your traffic unprotected by SSL encryption.
  2. Generate your own IBM security certificate and then apply it.
  3. If you are using TM1 10.2.2, upgrade to a 2048 bit SSL as discussed on this IBM Support FAQ link.
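To illustrate option 1 above, here is a minimal tm1s.cfg sketch. The ServerName and PortNumber values are placeholders for your own server settings; UseSSL is the parameter that controls SSL for the TM1 server instance. Again, leaving it set to F is not recommended.

```ini
[TM1S]
ServerName=MyTM1Server
PortNumber=12345
# Setting UseSSL to F disables SSL for this TM1 server instance.
# Not recommended: client/server traffic will travel unencrypted.
UseSSL=F
```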

Should you need assistance working through these instructions or any other questions regarding IBM security certificates, call Lodestar Solutions at 813-254-2040. If necessary, we will involve one of our technical consultants. We can also install the fix packs to ensure that your systems are up to date with the latest enhancements and updates!

UPDATE – IBM has announced that the update fix for this issue (including some enhancements) will be released at the end of October or early November 2016.

Cognos TM1 Blogs


Below is a massive list of IBM Cognos TM1 blogs.

New in Planning Analytics Workspace 79
Written by Mike Bernaiche, September 15th 2022 On September 1, 2022, IBM released Planning Analytics Workspace 79 or 2.0.79.  There[...]
Coming Soon in Planning Analytics Applications and Plans
Written by Mike Bernaiche, July 29th 2022 Recently, I had the opportunity to see what was coming in Planning Analytics[...]
New in Planning Analytics Workspace Version 77 and 78
Written by Mike Bernaiche, July 28th 2022 It is that time again!  Providing you everything new in planning analytics workspace[...]
What’s New in Planning Analytics Workspace 75 and 76
Written by Mike Bernaiche, May 19th 2022 Typically, once a month, IBM updates Planning Analytics Workspace.  In April version 75[...]
What’s New in Planning Analytics Workspace 73 and 74
Written by Mike Bernaiche, April 7th 2022 Typically, once a month, IBM updates Planning Analytics Workspace.  In February version 73[...]
IBM Planning Analytics Versions Are End of Life
Written by Heather L. Cole, March 3rd 2022 Are you an IBM Planning Analytics TM1 user that has not upgraded[...]
Scenario Planning with IBM Planning Analytics
Written by Heather L. Cole, November 4th 2021Scenario Planning with IBM Planning Analytics Did the Pandemic disrupt your business?  Did[...]
What’s New in Planning Analytics Workspace Version 67 and 68
Written by Mike Bernaiche, October 21st 2021 Over the last couple of months, IBM has released 2 updates to Planning[...]
Forecast in Planning Analytics with AI
Written by Mike Bernaiche, October 14th 2021 Did you know that you can forecast in Planning Analytics Workspace by using[...]
Scorecards and Metrics in Planning Analytics
Written by Mike Bernaiche, September 30th, 2021Did you know that you can track your key performance indicators (KPI’s) with Scorecards[...]
What’s New in IBM Planning Analytics Workspace Version 64, 65 and 66
Written by Mike Bernaiche, August 19th 2021IBM continues to update Planning Analytics Workspace on a regular basis.  The pattern has[...]
How to Burst Reports from IBM Planning Analytics
Written by Heather L. Cole on August 5th, 2021 Are you an IBM Planning Analytics client and wondering how to[...]
How to Connect TM1 to Power BI
Written by Heather Cole, May 13th 2021Have you been asked to pass the TM1 data to Power BI, Tableau, Qlik[...]
Planning Analytics Workspace – Now What?
April 30th, 2021 Recently, it came to my attention that some Planning Analytics (formerly TM1) clients are not clear about[...]
What’s New in Planning Analytics Workspace Version 61 and 62
Written by Mike Bernaiche on April 23rd, 2021IBM continues to release updates every 30 to 60 days for Planning Analytics[...]
How to Easily Move Development Changes to Production in IBM Planning Analytics TM1 with Soterre
February 4th, 2021 TM1 Administrators of the IBM Planning Analytics that follow best practices should be using a development environment[...]
Version Control and Audit Trail for IBM Planning Analytics TM1 with Soterre
January 28th, 2021 Are you a TM1 Administrator and looking for a better way to know which Modeler changed a[...]
What TM1 Licenses Include IBM Planning Analytics Workspace?
January 7th, 2021 Looking for a better user interface to your powerful Cognos TM1/ Planning Analytics data?  Do your users[...]
Guide to Creating a Virtual Hierarchy
November 5th, 2020 In todays planning and forecasting world, the ability to change quickly and provide valuable data to your[...]
What Kind of Audit Trail is Available for Cognos Planning Analytics (TM1)?
October 29th, 2020 A budgeting and planning solution should always provide a good audit trail.  So, what kind of audit[...]
Sandboxes in Planning Analytics Workspace
October 1st, 2020 Think back to when you were younger playing in a sandbox.  Grabbing your plastic shovels and building[...]
Input and Spread Data in Planning Analytics Workspace
September 24th, 2020  Now that you have read the introduction to Planning Analytics Workspace (PAw) blog found here, it is time[...]
Benefits to Using IBM Planning Analytics Workspace
September 17th, 2020 If you are a TM1 or Planning Analytics client and you are not using Planning Analytics Workspace,[...]
Planning Analytics Excel Options
August 20th, 2020 Many of you use Excel when working with Planning Analytics.  You may be creating budgets, exporting files,[...]
Why Buy TM1 Connect?
July 31st, 2020 We are often asked; how can I bring my TM1 data into other software and create reports[...]
Why is TM1 Slow?
Are you a TM1 user and frustrated with the speed? Are your perspective reports extremely slow? Does opening a cube[...]
Creating the COVID Forecast with Cognos Planning Analytics, TM1
April 30th, 2020 ​Wow, the world has changed in just a few weeks!  Our way of life as we knew[...]
Cognos Planning Analytics for Excel
March 12, 2020Upgrading TM1, checkout Cognos Planning Analytics for Excel PAX Are you a long time TM1 user looking to[...]
Move Action Buttons from TM1 Perspectives to Planning Analytics for Excel (PAX)
March 5th, 2020Are you planning your upgrade from TM1 to Planning Analytics? After you review all the benefits, steps and[...]
IBM Let the Cognos TM1 Grandfather Clause Die Putting Old Clients at Risk!
February 20th, 2020​To all you very old TM1 clients I urge you to check your IBM entitlements!  I recently learned[...]
Neither Cognos BI nor Microsoft Power BI can Replace Cognos Planning Analytics (TM1)!
February 13th, 2020Recently I met with a client who owned and was using both IBM Cognos BI and TM1.  Their[...]
Is it Time to Redesign Your TM1 Planning System?
December 12th, 2019As business analytics coaches it is our passion to talk about latest innovations in the financial planning and[...]
Complete Model Development in Planning Analytics Workspace
November 11, 2019As business analytics coaches it is our passion to talk about latest innovations in the financial planning and[...]
Edit Dimensions and Others Greyed Out in Planning Analytics Architect
September 20, 2019The problem - Edit Dimensions and Others are GreyWe get this question from time to time - Why[...]
What does it cost to upgrade from TM1 to IBM Planning Analytics?
September 10, 2019“Hey, I heard a rumor that after September 2019 my TM1 10.2.x will not be supported. What does[...]
Model Directly in IBM Planning Analytics Workspace. A one stop shop
1/22/2019 Are you asking yourself if it is  possible to build a model directly in IBM Planning Analytics Workspace? There is[...]
7 Reasons Why You Should Purchase Workspace (PAw)
 9/11/2018 7 Reasons Why You Should Purchase Workspace (PAw)You have probably heard a lot about Planning Analytics Workspace (PAw).  However, like[...]
Dashboards in TM1? Legacy Users Need to Purchase Workspace Add-On.
7/10/18 Are you a Legacy TM1 shop and your users are asking about dashboards in TM1?  For all of you[...]
New Hierarchy Level Behavior in Planning Analytics (TM1) Set Editor
3/20/2018Leaf Element Level naming convention is really changing...It's when I first started teaching the Performance Modeler (PM) class and showing[...]
Have You Secured Funding Yet? It’s Budgeting Season, Secure Funding Now For IBM Cognos
Have You Secured Funding Yet?  It's Budget Season.  Secure Funding Now for IBM Cognos!​Do you need process improvement and require[...]
Lodestar Solutions is Hiring – TM1 Consultant Needed
5/11/2017TM1 Consultant OverviewAre you an expert in Cognos TM1? Do you have advanced knowledge of Microsoft Excel, Access or SQL[...]
TM1 User Maintained Public Subsets
3/6/2017 As a TM1 Admin have you ever wished for user maintained public subsets? User maintained public subsets puts the[...]
Recreating TM1 Alternate Hierarchies
2/8/2017Cognos TM1 allows alternate hierarchies which, from a user perspective, is greatly appreciated but from a TM1 Admin perspective, probably[...]
TM1 Websheets to Simplify, Document and Organize TI Process and Tasks
12/22/2016I’m all about easy. It is a phrase I say repeatedly to my clients that are new to IBM Cognos[...]
Are Your IBM Security Certificates About To Expire?
10/4/2016The purpose of SSL certificates (Secure Sockets Layer) are for encrypting data as it is transferred around a network. These[...]
Top TM1 Automation Techniques Used by Expert Consultants
9/20/2016 As a TM1 modeler, you likely have more tasks to do in a day than there is time. You[...]
Why Backup Your TM1 Development Environment?
6/18/2016Hello… Atif Hameed here for Lodestar Solutions. As business analytics coaches, it is our passion to talk about things that[...]
How to: Deleting or Adding Dimensions to TM1 Cubes
6/1/2016Today’s topic is about deleting or adding dimensions to TM1 cubes that you are using in Production. With TM1’s introduction of Performance[...]
Creating a Flexible TM1 Allocation Model
5/3/2016This past week I had to script several TI Processes to make allocations as flexible as possible. This was done[...]
Update TM1 Contributor Views with TI Process
4/13/2016I recently had a client ask how they could go about updating the latest month for users to see when[...]
Why to Upgrade From Excel to Cognos TM1 or Express
  Most people have a love/hate relationship with Excel. They love what it can do but hate when it fails[...]
Naming Conventions in TM1 – Keep Your Data Clean!
2/9/2016As business analytics coaches, it is our passion to talk about things that make your job easier. Today we are[...]
TM1 Data Export in Multiple Columns
2/2/2016In TM1 (aka Performance Management), you can use a Turbo Integrator (TI) process to not only load data, but to[...]
Cognos Command Center – True Integration Capability
1/6/2016As a business analytics coach at Lodestar Solutions, it is our passion to talk about things that make your job[...]
IBM Cognos TM1 10.2.2 Fix Pack 4 Is Available
11/17/15It's here. IBM Cognos TM1 10.2.2 Fix Pack 4 is now available and, along with a list of all the bug fixes[...]
TM1 Perspective Websheets & Active Forms
​TM1 Perspective Websheets & Active FormsTM1 Perspective Websheets & Active Forms can be extremely helpful to take your TM1 data[...]
TM1 Drill Through From Cube to Cube Not Working?
9/30/15If you are going to utilize the Cognos TM1 Architect drill through functionality from one cube to another, there are some considerations[...]
IBM TM1 Applications Web: Better Tool Today
9/2/15As Business Analytics Coaches, it’s our passion and responsibility to find and talk about products that make your job easier.[...]
Understanding Cognos TM1 Licensing
8/26/15​Cognos TM1 licensing - Enterprise vs. ExpressTo better understand the IBM Cognos TM1 licensing functionality and structure, we’ll break it[...]
Moving to Cognos TM1 from Transformer/Powerplay: Pros and Cons
8/4/15As a Business Analytics Coach and resident Transformer/Powerplay expert, I’m often asked about the advantages or disadvantages to moving from[...]
Cognos TM1 TurboIntegrator – Database Table Manners
7/29/15 Cognos TM1 TurboIntegrator (TI) is a great ETL tool that allows you to bring in data to TM1. It[...]
In-Memory Cube Technology = Faster Performance!
Reporting, analysis, budgeting and planning solutions for many years have leveraged multi-dimensional cubes similar to Excel pivot tables but better. [...]
Cognos TM1 TI Processes Using Cubes for Dynamic Variables
4/28/15TM1 cubes can be used for more than calculations and analysis.  They can be utilized as parameter variables for TI[...]
Why Did They Add Cognos Scorecarding to TM1?
4/24/15​Why Did They Add Cognos Scorecarding to TM1?IBM Cognos Metric Studio has been part of the IBM Cognos BI solutions[...]
TM1 ARCHITECT – When Do You Need It?
4/16/15As business analytics coaches, it is our passion to discuss topics that make your job easier. Today we are going[...]
TM1 Operations Console Replaces TM1 Top
3/24/15As a Business Analytics Coach at Lodestar Solutions, it’s my passion and responsibility to enlighten you about functionality that is[...]
Cognos TM1 Reporting Capabilities with BI
As a Lodestar Solutions business analytics coach, clients often ask us to help decipher the intersection of  IBM Cognos licenses[...]
TM1Connect Extract Data From TM1 for Reporting & Analysis
As Business Analytics Coaches, it’s our passion and responsibility to find and talk about products that compliment your existing tool[...]
TM1 Performance Modeler – Creating Model Flow Charts
Using Performance Modeler to create TM1 Model Flow ChartsTwo of our recent blogs described the great benefits of TM1Compare and[...]
The TM1 Migration Process Understood; Can TM1Compare Help?
As Business Analytics coaches at Lodestar Solutions, it’s our passion and responsibility to find and talk about products that make[...]
Documenting TM1 with Pulse by Cubewise
Documenting TM1 with Pulse As Business Analytics Coaches, it is Lodestar Solution's passion and responsibility to find and talk about products[...]
Documenting TM1 with QUBEdocs
QUBEdocs - is this the solution to some of your TM1 challenges? One of the biggest challenges with TM1 has[...]
Publishing TM1 Aliases in Cognos BI
Have you been thinking about publishing TM1 aliases in Cognos BI?  It is completely unintuitive, but if you want to[...]
New Tool for Importing Cognos Planning into TM1
Are you using Cognos Enterprise Planning and looking to move to IBM Cognos Performance Management/TM1?  There is a new tool[...]
Cognos TM1 Sales Compensation Model
As much as sales people say, “I’m not a numbers person. I’ll just let you accountants handle that”, you start[...]
Do you need a TM1 application health check?
There can be many reasons a particular TM1 application has performance issues. TM1 gets its power from the in-memory engine.[...]
What is the IBM Cognos Analytics Server ICAS?
​What is the IBM Cognos Analytics Server ICAS?With all the massive changes by IBM to the IBM Cognos Licensing, people[...]
TM1 Period to Date Dimensions
As Business Analytics coaches for Lodestar Solutions, we come across questions that are asked of us by our clients. A common question[...]
TM1 10.2 Security Changes
With TM1 10.2 security changes, TM1 10.2 now allows you to share/re-use the same dimension in multiple approval applications.  This[...]
How to Improve Your Planning Model
For many in the Finance department, this is the time of year that you start your preparations for next year’s[...]
Cognos BI Reporting – TM1 Data Source, Levels & Alternative Options
Today, we are going to discuss Cognos BI reporting and how it relates to TM1. As many of you know,[...]
Make More Money with TM1 Skills
How can I make more money with TM1 skills?  Learning Cognos TM1 on-the-job can give you the opportunity to advance[...]
TM1 Action Button Merging Parameters
​TM1 Action ButtonTM1 Quirk of the Day: Parameters Merging in TM1 Action Button Properties When using TM1 10.2, I’ve encountered[...]
Cognos TM1 Turbointegrator Processes Quirks
As our Cognos Coaches come across little (and big) “gotcha’s”, we like to share them with you.  Below are a[...]
TM1 Scorecarding and Cognos Metric Studio
TM1 Scorecarding and Cognos Metric Studio, a look into the future!You might ask yourself, what do I need to know[...]
Saving TM1 rules takes too long!
"What's the deal with the TM1 rules I'm writing?!?!" Why does saving TM1 rules take a long time occasionally? I[...]
TM1 Principle Element Name Dilemmas
There are dilemmas when using Skeys, ID # and Name in conjunction with creating a Principal Element Name and/or  Aliases. [...]
Tricks to Working with DAYNO Function in TM1
Tricks to Working with DAYNO Function in TM1In case you encounter some frustration creating a TM1 rule to use the[...]
DimensionElementPrincipleName – What The Heck Is It?
I recently had an issue with a TM1 TI Process that did not want to load data where the data[...]
Exporting TM1 Dimensions
Exporting TM1 Dimensions - quick and easy way to do this through a TI processI was wondering aloud recently about Exporting[...]
TM1 Implementation Methodology for Success
All Cognos TM1 implementations require a strategy to deliver an appropriate reporting solution. This is the plan that will ensure[...]
Departmental Data Warehouse Options
Are you a data chaser?  Are you spending hours of your time grabbing data from various data sources, throwing the[...]
Migrating IBM Cognos Planning to TM1
Are you thinking about moving from IBM Cognos Planning to TM1? Well, let me tell you, for those of you who have been using Cognos Planning, moving to TM1 will be like going from having a pot-luck dinner to a catered gourmet dinner. The biggest difference is the cube sizes. TM1 can handle HUGE cubes of data. And yes, size does matter. If you currently have planning models where you created separate applications and use Admin Links to pull your data together because the models were too big for Planning, you will LOVE TM1. Lodestar has migrated a large number of Planning clients to TM1 and many have been able to consolidate their models to simplify maintenance. You still have the ability to limit users to what they need to view and edit, but it will all be in the same database.
TM1 Element Security Within a Dimension
SECURITY_ELEMENT assigns security to elements within a dimension and creates/updates the }ElementSecurity_dimensionname control cube. Note: if the }ElementSecurity_dimensionname control cube does not exist, this must be executed first.
Use TM1 Cubes as Data Dictionary & Analysis Report Usage
For many companies, the same data/metric term has different meanings for different departments. An example of this would be “Customer”. To the sales department, it may mean they have a new customer that has been entered into the system and now has an account number. To Finance, “Customer” may mean that this person has been sent at least one shipment within the last 12 months.
Cascade Cognos TM1 Security from Parent to all Children
Turbo Integrator's security cascades from parent to all children and is dimension specific. After the initial security (access) is loaded, the level of the Dept variable (1st column in the text file) is checked. If the level is greater than zero, the elements in the Department dimension are cycled.
Unable to Launch Cognos TM1 Perspectives via Microsoft?
From time to time, Microsoft updates may cause an error and not allow you to use TM1 Perspectives. If you encounter this, please follow the instructions below:
Don't Drive Blind – Incorporate Planning Data with TM1
Every company is wrapped up in their own data. How do we capture our data? How do we analyze our data? How do we report on our data? It seems like the ultimate goal in many companies is how to report on their data and get through the monthly meetings with a pretty presentation.
TM1 Dynamic Subset with Dimensional Data
You create a dynamic subset with a specific consolidated item and the next time the dimension is updated, the subset takes on a mind of its own. . .and you recreate it. Been there, done that – right? Try the following:
Creating TM1 Dynamic Subset of Individual Elements
Most of us have been there - laboriously scrolling through a dimension containing dozens or hundreds of elements. The goal is to select 3 or 4 elements for creating that one of a kind subset. A few minutes of scrolling . . .success. Then you realize you left out one item. And so the scroll and click begins once again. Ick. Here’s how to replace scroll and click. . .
IBM Cognos Express Approval Hierarchy Considerations
The same strategy works when using TM1, or in this case, Planner, within IBM Cognos Express. Unlike the Enterprise version of Cognos where the administrator can create new TM1 server instances, only a single TM1 service is allowed in Express. An approval hierarchy can be used only once on a given TM1 server. Therefore, if a dimension is assigned to a Planner application as its approval hierarchy, that dimension cannot be used again as an approval hierarchy in any other application.
Cognos TM1 Data Security
The ability to access cube data in TM1 changes significantly once the TM1 application is deployed. While we are just starting off building our model in TM1 Performance Modeler, the cells within our cube are writable as long as they are not a calculated element. Below in our Revenue cube, we can manually enter data into any of the cells that appear white.
Copying a TM1 Service to Another Server
Sometimes retaining an image can be a nice trip down memory lane, but in the case of copying a TM1 service from one environment to another, it can create an error. We recently built a simple TM1 model and deployed the TM1 Web application for use as a demonstration model. We wanted to keep a backup copy of it on another server. We copied over all of the data and log files to the second machine, including the tm1.cfg file. We then added the new service instance to IBM Cognos Configuration for TM1.
Create TM1 Dimensions in Excel Worksheet
Though the preferred method for creating dimensions in TM1 is via a Turbo Integrator Process, sometimes you may want to do this with a dimension worksheet. With a dimension worksheet, you can incorporate Excel functionality to do concatenations that will populate your attributes. This works nicely for Aliases where you would like to use the element name with a description. The example below shows you how to do this for an Accounts dimension.
Creating A New TM1 Server
I could understand the frustration of a new modeler who could not gain access to the new TM1 server he had just created. He closely followed the set of instructions he had again and again to no avail. First, he had copied the tm1s.cfg file from an existing operating model on the same network, changing the ServerName, PortNumber, LoggingDirectory and DatabaseDirectory within that new file to reflect the name and location of the new server he was setting up. Second, he created a new server instance in IBM Cognos Configuration and edited the location to point to the directory from the tm1s.cfg file resided.
Different Methods to TM1 Model Development
TM1 allows for different preferences and with the release of version 10, offers the end user yet another way of developing a model. Performance Modeler is even more user friendly than the familiar modeling tools of Architect and the Excel add-in Perspectives. Performance Modeler is an excellent path to take for the beginning modeler. Its many built-in wizards will guide the end user through the building and populating of cubes. Experienced modelers may prefer to stick with the familiar environment of Architect, instead of learning the nuances of a new tool. Many finance folks are married to Excel and want to remain in a formula driven workspace.
Stored Procedures in TM1 Turbo Integrator
As many of you know, when you use Turbo Integrator to import data and metadata into TM1, you can bring it in from many sources. You can use comma-delimited ASCII files, other TM1 cubes and views, MSAS (Microsoft Analysis Service) and relational database tables that are accessible through an ODBC connection. For this blog, we are going to concentrate on relation database tables through SQL and what you may not know is that using multiple tables and writing multiple joins query in SQL using the Turbo Integrator query box can make the pull of the data very slow!
Migrating Planning to TM1
IBM Cognos TM1 is architected very differently than Cognos Planning. Planning's distributed architecture and cube structures lead to limitations in model sizes. IBM Cognos TM1 is a powerful 64-bit, in-memory solution that has tremendous power if models are designed with TM1's power and functionality in mind. For Cognos Planning clients to fully realize the power of TM1, Lodestar Solutions recommends a model "Reconstruction".
What's New: Cognos Insight 10.1.1
What’s New in IBM Cognos Insight 10.1.1? 1. Percentage of Total Calculations 2. Calculations in Parent 3. Time Dimension Import 4. Chart Interaction 5. Improved Usability Features a. Simplified Terminology b. Access to Formatting c. Keyboard Shortcuts 6. Guided Import 7. IBM Cognos Insight Trial Edition
Execute Multiple TM1 TI Processes with Parallel Loading
As a TM1 power user, you are aware that a Turbo Integrator process sets a lock on the cube whose dimension it is maintaining. This lock can force end users to wait until the TI process completes and the lock is released. If you have a large amount of data to load, this can be very annoying to end users, since they may have to wait a long time. So, it is recommended that you separate metadata maintenance from data maintenance in separate TI processes and run them as a chore.
TM1 Contributor Rights and Functionality Explained
TM1 Enterprise Planning Contributor is the primary license end users of TM1 utilize. It allows end users to have read and write access to the TM1 models. Under TM1 Contributor licensing, end users have the following access rights:
What is Cognos TM1 Performance Modeler?
IBM Cognos TM1 10 includes a component called the Performance Modeler. It’s a great step in the right direction for IBM to simplify the model building process in TM1. It’s a wizard based component that will allow less technical people to manage and maintain models. However, if you are an existing IBM Cognos TM1 user who is considering moving to IBM Cognos TM1 Performance Modeler, you should be aware of the following:
Consultant's Cognos TM1 Tricks (Part 2)
6. LEVERAGING CONTROL CUBES FOR SECURITY - Control Cubes can be useful to view attribute or security information. If migrating from Planning, it may even be possible to use an existing Excel spreadsheet to load security via a TI Process [into the Control Cube].
Consultant's Cognos TM1 Tricks (Part 1)
CREATING ALIASES - When creating aliases, use consistent attribute names. Name or Desc will usually suffice. (We frequently find that name is used on one dimension, desc is used on a couple others and description is used on a few more.) This reduces those “uh oh, what did I call that attribute” moments when creating a process or updating rules.
Incorporate TM1 Naming Conventions in Your Model Build
TM1 is great for organizing your planning models. Make the most of this and incorporate a naming convention for your dimensions, views and TI processes. This makes building, maintenance and training much easier in the long run, especially when you have more than one Admin. Design a simple blueprint of your naming convention at the beginning of your implementation and keep it in your application folder for quick reference. Another tip is to use the underscore in lieu of a space in titles, whether it be your cubes, dimensions or the title row of your data import. For example, when creating a dimension using TI, I prefer the variable name to show up as “Element_Name” instead of “v1”, which is what you get when the column title contains a space (“Element Name” vs. “Element_Name”).
Moving from IBM Cognos Planning to TM1 Part 2
This is the second piece of the story on why an IBM Cognos Planning client would consider moving to IBM Cognos TM1. Power User Benefits- The power user license is the TM1 Modeler. The role is similar to the Enterprise Planning Modeler in that this person is typically responsible for building, modifying and deploying the TM1 Planning model to the end users (Contributors).
Moving From IBM Cognos Planning to TM1 Part 1
There’s been a lot of discussion on why an IBM Cognos Planning client would consider moving to IBM Cognos TM1. So, I thought I would do a little brain dump on my experience with clients. Today, I’ll start with the End User. End User Benefits- End users can choose from a couple different user interfaces. I will focus on the TM1 Contributor user interface for TM1 as it closely resembles Planning Contributor. It includes some functionality we have been requesting for years.
8 Keys to Successful Cognos Dashboards
The purpose of Organizational Dashboards is to create clear visibility in your organization, which leads to better decision making. So why do so many organizations struggle with Dashboards? Effective Dashboards require:
Reliable and timely source data
Measures that motivate behavior to achieve Key Performance Indicators
Clear definitions of measures – i.e., does “Sales” include returns and allowances?
continued ...
Integrating TM1 9.5 and Cognos BI – Part 2
We continue from the previous Lodestar blog on Integrating TM1 9.5 and[...]
Integrating TM1 9.5 and Cognos BI – Part 1
IBM Cognos 10 greatly improves the performance of BI reports that use TM1 as a data source, but many clients aren’t leveraging the integration in their Cognos environment. Below are Lodestar guidelines:
Minimal IT for TM1 Implementation
We are constantly being asked by finance and other areas of the business if they can implement IBM Cognos TM1 without IT involvement and the answer is NO! However, with proper planning you can implement with minimal involvement. The key is to use IT strategically. They are very valuable to your project as they often have insight to your data structures, sources and potential data integrity challenges. When IT is actively involved in the project and partnering with the business, success is rapidly accomplished.
Parallel Interaction in TM1 9.5.2 Improves Write Performance
Looking for performance improvement in IBM Cognos TM1?  IBM added a new configuration in TM1 9.5.2 to improve write performance[...]
Improve TM1 Performance Via Cube Dependencies
During server load, TM1 establishes cube dependencies based on feeder statements in rule files. Feeder statements that have data-dependent cube references, or rule files containing ATTRN or ATTRS functions, will not establish cube dependencies during server load – rather, they will be established during the first query or data update that invokes these cross-cube rules. The result is that a query or Turbo Integrator process can trigger a cube dependency evaluation, where the system goes out and determines which dependencies are required before returning results. If this happens during periods of user activity, it can block objects and cause contention issues for concurrent readers and writers. So, the user will complain the system is slow!
How to Kill a TM1 Operation – TM1 Top
The TM1 Top utility enables you to dynamically monitor the threads/operations running in an instance of the TM1 server. TM1 Top is a stand-alone utility that runs within a console (command) window on a Microsoft Windows system. It is designed to make minimal demands on the TM1 server and the supporting network and system. With the exception of a user-initiated login process, TM1 Top does not use any cube or dimension resources in the TM1 server, and does not use or interact with the data or locks on the TM1 server.

Update TM1 Contributor Views with TI Process


I recently had a client ask how they could go about updating the latest month for users to see when going into a TM1 Contributor application, without having to update each view individually. Well then…let's discuss TM1 Contributor views. To update the default element that is used in a title/page dimension of a cube view, you will use the Turbo Integrator function ViewTitleElementSet.   This allows you to update the element chosen for the page, without eliminating the other elements in the subset used that would be available if you were to click on the dropdown for choices.  If you have several views that use the same dimension, you would simply add them to your TI Process.  You can leverage the Parameters tab to simplify things further if you have more than one title/page dimension in views that need to be updated.  The example below takes you through the steps for updating one view, but you can update many views at once in a single process.

This is the syntax of the function:

            ViewTitleElementSet(CubeName, ViewName, DimName, Index);
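For example, a minimal Prolog-tab sketch that points two views of a hypothetical Budget cube at a new month (the cube, view and dimension names, and the pMonthIndex parameter, are all assumptions for illustration):

```
# Prolog tab (all object names are hypothetical).
# pMonthIndex would be defined on the Parameters tab as the
# position of the desired month within the view's title subset.
ViewTitleElementSet('Budget', 'Input View', 'Month', pMonthIndex);
ViewTitleElementSet('Budget', 'Review View', 'Month', pMonthIndex);
```

Note that the Index argument is a position within the title subset assigned to the view, so the month you want must already exist in that subset.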

TM1 Contributor Views w/ TI Process

TM1 defaults all user views to a “persisted state”. This means that if a user rearranged the default view, TM1, as a “favor”, leaves the view the way the user left it. So your users will still have to reset each view (or choose the option to reset all views) in order to see your newly updated view after you run your TI process.

At this time, TM1 does not have the magic Reset button some may want or need for their users. If you feel bold and want to try the existing workaround, please click here for the IBM documentation:

Resetting All Persisted Views for an IBM Cognos TM1 Application

Should you have any questions about TM1 Contributor views or any other TM1 questions, feel free to contact us at

The Power of Cognos Insight


Do you struggle with trying to analyze offline spreadsheets or data sources? Do you receive data from 3rd parties that needs to be analyzed quickly? Do you want to look like a rock star and quickly create a TM1 cube and dashboard on this data? How would you feel if you were able to do all of this and more, including publishing to the Cognos network for consumption throughout the organization? Let us introduce Cognos Insight.

IBM Cognos Insight can make all of this happen and more! Let me tell you about an internal struggle I had recently and how Cognos Insight has become a daily tool within Lodestar Solutions. 

It was time to build Lodestar’s 2016 budget, and I wanted to be able to quickly build a budget that could be implemented and shared throughout our organization. I utilized files from our accounting software, dragged them in, manipulated the hierarchies a little and very quickly created an actuals cube that was the basis for our 2016 budget. Once the budget process was complete, I took that info and created a scorecard which is utilized daily at Lodestar Solutions. With certain numbers updated weekly and monthly, our entire organization has insight into our business. Upper management can react to changing trends quickly, while the service team can see who is doing what and which projects are in the queue. Additionally, the entire company can have conversations and create actions based on fact, not assumption. Doesn’t that sound like a powerful tool that can make you shine in your company?

Lodestar Solutions can help you with Cognos Insight. We have an awesome workshop available for you to access which will teach you how to use it in a couple of hours. It includes files and instructions to create cubes, calculations and dashboards. Lodestar Solutions also provides a link to download the free version of Cognos Insight. This version has all the features of the paid version except the ability to publish back to Cognos and pull data from BI or TM1. Also, one important item of note: if you own BI or TM1, you own the full version of Cognos Insight.

Welcome screen with lots of valuable info:

Cognos Insight welcome screen

Example of Scorecard in Cognos Insight:

 Cognos Insight example scorecard

Another screenshot example:

Cognos Insight screen example

Contact us at for information on our Cognos Insight workshop and start utilizing this very powerful desktop tool! You can also view our Lodestar Solutions YouTube video HERE.

Naming Conventions in TM1 – Keep Your Data Clean!


As business analytics coaches, it is our passion to talk about things that make your job easier. Today we are going to talk about best practices in naming conventions in TM1 and application management. Naming conventions in TM1 are effective if and when adopted early. Once you have an established environment, it is difficult to go back and update or change the names of objects. It may sound trivial to change or update names, but it most definitely is not: changing the names of well-established objects in a TM1 ecosystem requires unravelling multiple layers of interconnected objects. Starting early and sticking to a naming standard will help in future activities like error resolution, optimizing the environment, adding capabilities and, most important of all, documentation and knowledge transfer.

Choosing a naming convention requires that all developers involved agree on it. Each object in the application has to be named using the same method. To take a systematic approach, we start with the lowest-level objects and work up from there.

Elements – This is a good place to start because elements are the lowest-level objects in the application. It is advisable to use digit codes, taken straight from the source, as the element principal names. If the source also carries a description, use that as an alias. If the description is not unique among all the elements, concatenate it with the digit code to make it unique. (There are many other ways to make names unique; the description must be unique because TM1 does not allow duplicate aliases.) Refrain from using special characters in element names. If your metadata source has special characters, replace or omit them when building the dimension in your ETL process.
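As a rough sketch of that rule in a Turbo Integrator process (the dimension name, attribute name and source variables below are hypothetical):

```
# Metadata tab: insert the leaf element under its digit code
# (vCode and vDesc are variables mapped from the source columns)
DimensionElementInsert('PL_Account', '', vCode, 'N');

# Data tab: write the description to the 'Desc' alias attribute;
# concatenating the code keeps the alias unique even when two
# accounts share the same description
AttrPutS(vDesc | ' (' | vCode | ')', 'PL_Account', vCode, 'Desc');
```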

Subsets – Keep subsets to a minimum. With many clients we have noticed duplicate subsets, meaning subsets that contain exactly the same elements. Keep the subset list clean by deleting duplicates. Use uniform names for dynamic subsets: be descriptive without making the name too long, and prefix dynamic subsets to label them as such. Most subsets should be created on the fly in a Turbo Integrator process, which greatly limits the number of subsets.
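A hedged sketch of creating a subset on the fly in a process Prolog (the dimension, subset and element names are made up for illustration):

```
# Prolog tab: recreate the working subset on each run so no
# stale copies accumulate ('plan_time' and 'zTemp_Months' are
# hypothetical names; a z prefix can mark the subset as temporary)
If(SubsetExists('plan_time', 'zTemp_Months') = 1);
  SubsetDeleteAllElements('plan_time', 'zTemp_Months');
Else;
  SubsetCreate('plan_time', 'zTemp_Months');
EndIf;
SubsetElementInsert('plan_time', 'zTemp_Months', 'Jan-2016', 1);
```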

Dimensions – If a dimension is specific to a cube, name the dimension with the cube name in it. For example, if an account dimension is for the P&L cube, using something like PL_Account is a good idea. That way all the P&L specific dimensions are listed together in the dimension list.

Cubes – This should follow the same pattern as the dimensions. For example, cubes should be prefixed with the type of cube it is. (PL_XXXX, BS_XXXX etc.)

TI Processes and Chores – Special attention should be paid to these objects, since they are what people look for when a specific activity has to be performed on the application. These activities can span anything from maintaining dimensions and loading data to extracting data. Having a good naming regimen here is essential for yourself and your successors. A good method is to prefix the name with the action being performed. For example, a dimension action can be prefixed with DIM_ and a data load can be prefixed with LOAD_. If you have a large application with a mix of different types of cubes, you can prefix the process names with the name of the cube (P&L_Load, P&L_Dim_Account). This kind of naming convention ensures objects are clustered together and makes life easier for everyone.

Following a structured approach that is agreed upon by all developers in the application is how you keep a sustainable TM1 ecosystem. Please feel free to reach out to us at for details on naming conventions in TM1 or any other items.