New Hierarchy Level Behavior in Planning Analytics (TM1) Set Editor

3/20/2018

Leaf Element Level naming convention is really changing...

I first noticed this while teaching the Performance Modeler (PM) class and showing students how to create a TI process with the Guided Import: in PM, the leaf level was not Zero.  The highest consolidated level was Zero.  At the time I assumed it was a fluke, that PM had simply been designed differently, and that the rest of TM1 would stick with the numbering convention Architect used.  Wrong!

You have to be a 10 to be number one.  Huh???

Fast forward, and I'm now working with Planning Analytics Workspace and PAx.  Like many consultants, we learned TM1 using Architect.  In Architect, level Zero was synonymous with the leaf-level element of a dimension, and we all know that TM1 holds numeric data only at the leaf level.  We've gotten very accustomed to using the terms "leaf", "n-level" and "Zero-level" interchangeably.  In fact, my first dynamic subset was created using the Filter by Level option, choosing "0" for the leaf elements, and even naming the subset "Level_0".

Being Level Zero is tops!

Historically TM1 has labeled its hierarchy levels where level 0 is the leaf level and the highest number is the top level.

This behavior persists today in Architect and Perspectives. With the new set editor released in the second version of Planning Analytics, however, the set editor follows industry-standard MDX practice: level 0 is the highest level and the highest number is the leaf level. In other words, the reverse of what it was previously.

End users should be educated about this change to avoid confusion when working with the data.

This issue is best highlighted when editing a set. First let’s look at Architect. In this example the plan_time dimension is opened and all elements are shown.

If we filter by levels as shown below, the 0 level will be the leaf level.

And if we filter on level 2, in this case it will return the highest level.

Conversely in the Planning Analytics Set editor in Planning Analytics Workspace or Planning Analytics for Excel, the reverse filtering method is true and is in alignment with standard MDX queries. In this example, we are filtering on level000 which returns the top level as seen below.

And if we filter on level002, it returns the leaf level as seen below.

The set editor also makes it simple to filter on the leaf level via the built-in Leaf Level option in the filter.

Before we get into some solutions to simplify the end user experience, remember that if users are proficient in MDX syntax including TM1-specific MDX functions, they can always quickly build their expression-based subsets by editing the MDX in the Set Editor.

For example, the following standard MDX expression would return all leaf level elements in our example:

[plan_time].[plan_time].levels(2).MEMBERS

If you wanted to return the top level elements, you could use the following expression:

[plan_time].[plan_time].levels(0).MEMBERS

Again, where 0 is the highest level and 2 is the lowest level in the example.

Conversely, you could use TM1-specific MDX functions to create the old behavior in the set editor’s MDX expression by using the following syntax which returns the leaf level:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 0)}

Or, the following to return the highest level in our example:

{TM1FILTERBYLEVEL( {TM1SUBSETALL( [plan_time] )}, 2)}

Where 0 is the leaf level and 2 is the top level.


Methods to Simplify the User Experience

Create Named Levels

One method is to give your hierarchy levels meaningful names for your end users.

To do so:

In Architect, ensure you have enabled Display Control Objects under the View menu.

Find and open the }HierarchyProperties cube and then from the Dimensions drop down list, select the dimension you would like to override the naming convention for.

In this case we will select plan_time and provide the following level names.

Save the View as Default and not Private.

Now the plan_time dimension’s hierarchy needs to be updated. To do so, we will run a TI process. Right-click Processes in the left pane and click Create New Process.

Click the Advanced tab and then the Prolog tab.

After the Generated Statements section, enter the following syntax, replacing the dimension name with the dimension you want to update:

RefreshMdxHierarchy('plan_time');

Save the Process with an appropriate name and then click on Run to execute the TI process.
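The manual steps above can also be scripted in a single TI process. A minimal Prolog sketch, assuming the sample plan_time dimension with three levels and the control cube's default dimension order (the level names here are hypothetical; adjust them to fit your model):

```
# Hypothetical level names for plan_time; level000 is the TOP level
# in the new MDX-style numbering used by }HierarchyProperties.
CellPutS('Year',    '}HierarchyProperties', 'plan_time', 'level000');
CellPutS('Quarter', '}HierarchyProperties', 'plan_time', 'level001');
CellPutS('Month',   '}HierarchyProperties', 'plan_time', 'level002');

# Rebuild the dimension's MDX hierarchy so the Set Editor
# picks up the new level names.
RefreshMdxHierarchy('plan_time');
```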

Now in the Planning Analytics Set Editor, the level names will appear in a more meaningful way.

Dealing with Ragged or Unbalanced Hierarchies

In some cases, the hierarchies may present elements at the same level as your consolidated levels which you may not want to display in your results when working in the Set editor.

In our example, we can see this with the Description element being displayed when filtering on the Year level.

A simple way to make the user experience more focused is by creating appropriate subsets.

In this case you could create and save a Public subset for the dimension that only displays the Years. In this case a subset called All Years.
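Such a public subset can also be created and refreshed in a TI process, which is handy if the Year elements change over time. A sketch with hypothetical element names (subsets created from TI are public by default):

```
# Recreate the subset from scratch each run.
IF(SubsetExists('plan_time', 'All Years') = 1);
  SubsetDestroy('plan_time', 'All Years');
ENDIF;
SubsetCreate('plan_time', 'All Years');

# Element names below are hypothetical -- substitute your Year elements.
SubsetElementInsert('plan_time', 'All Years', 'FY 2003 Budget', 1);
SubsetElementInsert('plan_time', 'All Years', 'FY 2004 Forecast', 2);
```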

With this selection, users have a focused list of elements to work with, which they can also expand, and there is no need for the extra clicks involved in filtering.

Notice description is no longer present.
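If you prefer an expression-based subset instead, standard MDX can exclude the unwanted member directly. A sketch using the example's names (the level number assumes the Years sit at level 1 of the example's three-level hierarchy; adjust to your model):

```
{EXCEPT(
  {[plan_time].[plan_time].levels(1).MEMBERS},
  {[plan_time].[plan_time].[Description]}
)}
```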

So the moral of this story is that you may as well get on board with the new numbering convention and start naming your subsets according to what "Level Zero" means going forward, rather than keeping its legacy meaning.  This will also keep the new set editor consistent with the way Performance Modeler treats levels in the Guided Import process for creating TI processes.

Get more out of your Planning Analytics/TM1 solution faster

Need to build more analytics into your model, but don't have the time?  Lodestar Solutions offers our Small Project Service.  Chipping away at your "To Build" list with our Small Project Services will get you results faster and cheaper than making a big production out of the "Next Phase Major Build" that usually gets kicked down the line due to cost and time.  If you have additional questions about this service or require assistance immediately, please contact Lodestar at (813) 415-2910 or at Services@LodestarSolutions.com.

TM1 Websheets to Simplify, Document and Organize TI Processes and Tasks

12/22/2016

I’m all about easy. It’s a phrase I repeat to my clients who are new to IBM Cognos TM1 to help them focus on the simplest solution rather than recreating a cumbersome process in new software. This is really a guided example showing how to use TM1 websheets to organize TI Processes, document TI Processes, and document TM1 tasks.

As a TM1 admin, you are pretty adept at toggling around cubes and understanding your data. Maybe you even built the whole model and it’s all second nature to you, so you skipped documenting procedures. Well… it all seemed second nature to you at the time, but a year has gone by and you need to prep for your upcoming budget cycle and now you are trying to remember what needs to be updated.

In any event, I want to explain how to simplify and manage tasks and documentation in TM1 with websheets. For this, I utilize Perspectives and the Applications folders of Server Explorer to streamline tasks. The Applications folders are great for grouping tasks with the use of links to documentation (i.e., Word documents), cube views set up for specific needs, and websheets that can take it all to the next level. By next level, I mean that you can create a websheet that provides data pulled from TM1, add in Excel formulas to do calculations on that data (thus eliminating the writing of additional rules in TM1), have an input area to update data points, include Action Buttons to run TI Processes, and insert notations to document the function.

This will not only make your life easier, it will help you remember how the heck you did something last year, allow you to delegate tasks to other team members, and document procedures to give you peace of mind knowing that you did not leave behind an unrewarding treasure hunt for your former co-workers to figure out how you did something after you won the lottery and left the company. You can feel free to relax and have that Pina Colada on a Tuesday while floating in the pool (note: only for lottery winners).

Here, I will run through taking your cube that houses rates used in an annual budget cycle from just being a cube for input to a streamlined TM1 websheet that simplifies the procedure.

Step One: Create cube for rate input and save view specific for input in TM1 Websheets.


What’s missing? Well, now you need to remember that your Seasonal_Factor that you must update is in the Rates cube. That’s not really evident when you first open Server Explorer.

Step Two: Use Applications folder area to organize tasks in TM1 Websheets


Getting closer. You dragged your Seasonal_Factor_Input view up to your Admin folder so that you know it is part of your Admin functions.

What’s missing? Working with just this view, I don’t see the average of the last two years to figure out what is trending in my data. Also, I will either need to put a HOLD on the Total Year so that if I make any changes, I still balance to 100% for the year, else I need to use the Data Spreading at the Total Year level to ensure I am at 100%. This is more effort than I want and it can be prone to error of Total Year not equaling 100%.


Step 3: Create a Websheet and put it in the Applications folder area in TM1 Websheets

Using Perspectives and the view I created earlier, I do a slice to Excel so that TM1 can put most of the formulas I need into the websheet I’m creating. In another tab of the same websheet I add a different slice that pulls in dollar-value data from two other years that I select on the first tab. In the third tab, using just Excel functions, I calculate percentage values. I go back to my first tab, which I will use for input, and again using just Excel functions, add a section that averages two years of data to give me a trend value. I use the SUBNM function for the Year drivers and layer in formulas to make their defaults dynamic, while still giving me the ability to select whatever years I want. I add two Action Buttons: one to send the values in my input area to the Rates cube, and a second to run a TI Process that ensures the Total Year equals 100% for each property. Last, I upload my websheet to the Server Explorer Applications folder called Admin. This allows me to update my values and see trending in a simple websheet accessed via TM1 Web.
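The key worksheet pieces can be sketched with TM1's Excel functions. The server, cube, dimension, and element names here are hypothetical; check the exact arguments against what the slice generates for you:

```
Year picker (SUBNM drives a pick list from a named subset):
=SUBNM("prod:plan_year", "All Years", 1)

Rate cell (DBRW reads from, and writes back to, the Rates cube):
=DBRW("prod:Rates", $B$1, $A5, "Seasonal_Factor")

Trend (plain Excel, averaging the two selected years):
=AVERAGE(C5:D5)
```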

In TM1 Web, I click on my highlighted websheet.

This opens a websheet that looks like this:

In conclusion, websheets can handle a variety of tasks to make your planning and analysis procedures easier to manage and more robust. In this example, we went from a cube view used to input Seasonal Factor values to a websheet that gave insight into trends, ease of data entry, and assurance of accuracy. I can customize the formatting to make it as pretty as I want. I can export it as a snapshot or PDF to send to others who may need this info but do not have access to TM1. I could even print it out and hang it on my refrigerator, if magnets stuck to its doors.

Most Excel formulas and formatting are supported in Perspectives and TM1 Web. TM1 Web does render slightly differently from Perspectives, so you may need to make some adjustments to get the results you expect.

To learn more about TM1 Worksheet functions,  click here. 

Simplifying Maintenance of Cognos Planning Access Tables w/ D-Cubes

11/8/2016

Are you a Cognos Planning client looking to simplify maintenance of your system? As a business analytics consultant specializing in Planning, I despise the maintenance of access tables. It seems like the process is still living in the 1970s. As a result, we are sharing with you today how to simplify maintenance of Cognos Planning Access Tables with D-Cubes. This technique allows for creating and updating Cognos Planning Access Tables by just changing the data values in a Cognos Planning Analyst D-Cube. I believe this is way better than manually maintaining the Access Tables in Cognos Planning!

The Challenge:

First of all, managing Access Tables for large and highly distributed Cognos planning models can be a labor-intensive and time consuming process. This is especially the case when the access rights change frequently. The Contributor Administration Console - Access Tables editor interface can be a royal pain.

The Solution:

Use an IBM Cognos Planning Analyst cube to manage access and update the Contributor Admin Access Tables. Once you create the Analyst D-Cube, you will be able to assign and maintain the data in the cube then export the cube data and import it into Contributor Admin. You can even put the process in a macro to automate it. This method is so much easier, more visual, and flexible. At least that’s my opinion. If you have simple access tables that don’t change, you might just want to use the Contributor Admin Console.

Setting Up Components in Cognos Planning Analyst

Step 1 – Create a D-List with 3 items: READ, WRITE, and HIDDEN, as shown below:


Please note that you might want to consider also using NO DATA in your d-list.

Step 2 – Create the Access Table cube in IBM Cognos Planning – Analyst by choosing the dimensions on which access levels will be determined.


An example of the Access Table in Contributor Admin is shown above.

In this example, we will create a cube with 2 dimensions (AcctsIncStmt and the e.List). To format this cube for consumption by the Contributor Administration Console as an imported Access Table, a D-Cube format is applied. Make sure the D-Cube is open and active. From the Analyst drop-down menus choose: D-Cube > Format. For format type, choose D-List and select the D-List created in Step 1. This will provide the options READ, WRITE, and HIDDEN (NO DATA).


Above, the rows correspond to AcctsIncStmt and the columns correspond to the e.List. Each cell holds the access level [READ/WRITE/NO DATA/HIDE] because we set up the cube as D-List formatted.

Step 3 – Populate the cells in the cube.

The Analyst D-Cube can be populated manually, automatically via d-link, automatically via formula or some combination of the aforementioned. Note that while not every cell needs to be populated, a default access level can be specified when importing this Access Table into Contributor. Any blank values will be populated with the default access level, in this case NO DATA.

"So what's the next step in the process?"

Step 4 – Exporting the Access Table from IBM Cognos Planning - Analyst

Now that the Access Table D-Cube has been created and populated, the next step is to export the table from IBM Cognos Planning Analyst in a format that Cognos Planning Contributor Admin understands. With the Access Table D-Cube you created open, select D-Cube from the Analyst main menu bar and then choose Export. The export selection should involve only the items needed for the Access Table. Choose the following options from the Export D-Cube settings:


a) Select Ascii File and then click the ellipsis to set the export file location.
b) In the Format box, select Tab as the separator.
c) For Format Column Headings, choose None.
d) In the Groups box, select Multiple Column.
e) In the Dimension Order box, ensure that the detail dimension is first, the e.List dimension is second (if applicable), and the data dimension is last. The dimension order MUST correspond to the order required by the Access Table.

To suppress items that have not been granted READ, WRITE, HIDDEN, or NO DATA access levels, click on the Zeros tab and highlight the line that represents Rows. When the import is executed, the Base Level Access will be set to NO DATA and any untagged dimension items will be set to NO DATA in the Contributor access table.
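For reference, the resulting tab-delimited file might look like the following sketch (item names are hypothetical): the detail dimension item first, the e.List item second, and the access value last.

```
Revenue	East Branch	WRITE
Revenue	West Branch	READ
Salaries	East Branch	HIDDEN
```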


Import The Exported File Into Contributor Admin

Now that you’ve created the Export file from the Analyst D-Cube defining the access levels, you will import the Access Table data into IBM Cognos Planning – Contributor. To import the tab-delimited file that was created above, open Cognos Planning – Contributor Administration Console, expand the Datastores, expand the application, expand Development, expand Access Tables and Selections, and choose Access Tables.

In Access Tables, select the correct dimension and cubes that the access table will be applied to and select the Import access table radio button. If you are using the eList in the access table, check the Include e.List checkbox.


Next, click the Add button. After the new row has been added, select it and click on the Import button. Click the ellipsis to select the file that was exported from IBM Cognos Planning – Analyst. Choose the Base Access Level that will be applied to the access table being imported. You might want to select NO DATA, which will result in NO DATA being applied to any items not defined in the import file. Also, ensure that Options – Import is checked, and that "First row contains column headers" remains unchecked.


After these selections have been made, click OK and the import should run with no errors. This will stage the updates to the access table but before they are applied in Contributor you will need to run a GTP process.

Note: The view of the Access Table is in a different order than the order required by the import file.


Automate It

Once you set this process up, you can automate it with a macro. By leveraging an Analyst D-Cube to define the access table, you will simplify maintenance. This is recommended when your access tables are complex, change often, or just seem inefficient to maintain by hand.

In conclusion, in these few short steps you can create a cube to define and maintain access tables in Analyst, export the definitions, and import them into the Contributor Administration Console. For more information on Cognos Planning, check out the IBM Knowledge Center for the latest product documentation and user guides.

There's Got To Be A Better Way...

Let’s now talk about how challenging it can be to define the security in Cognos Planning. The maintenance time of just waiting for the GTP processes to run can be frustrating. But there is an alternative. You know IBM has a more powerful planning and budgeting solution, called TM1 (Performance Management). The beauty of TM1 is that assigning security even down to the cell level is much easier. You can make changes in your model and not even have to do a sync or GTP process!

Here at Lodestar Solutions, we feel the future of Cognos Planning is in question! So, we encourage you to start considering and defining your migration to a better tool. Check out our blog on the benefits of TM1 over Cognos Planning to learn more information.

Lodestar Solutions wants you to be educated, so we created a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library of videos and templates and there is NO COST TO YOU. Finally, if you have specific questions, please contact us at Coaching@lodestarsolutions.com and one of our analytics coaches will get right back to you.

For more tips and tricks regarding Cognos Planning...

If you learned something from this blog, we encourage you to refer back to the previous tips in this series.

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding the re-ordering your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Check out Tip #6 on how to zero out data in Planning Contributor, CLICK THIS LINK.

And for our final Tip #7 on making an analyst library copy in Cognos Planning, visit our previous blog by CLICKING THIS LINK.

How to Zero Out all of the Data in Cognos Planning!

10/25/2016

Do you need to reset a cube to zero in order to clear out all the data in Contributor so users can start over? Maybe you are updating processes and you require a reset of your cubes in Contributor? Below, we will share how to zero out all of the data in Cognos Planning on a per-cube basis only.

It’s an all-or-nothing option – every cell within the cube will be set to zero. Please note that the option to zero out only a selection of the cube, like a version, does not exist. However, you can run an Analyst to Contributor link to zero out a selection of the Cognos Planning cube. For more information on zeroing out a selection, check out this IBM link on how to zero out Contributor D-Cube data on the web. Before you run this process, please ensure your users are OK with a complete reset of the cube to zero.

Steps to Zero Out Data in Cognos Planning Contributor

The first step in zeroing out all of the data in Cognos Planning Contributor is to go to Cognos Planning Contributor Administration. Next, select the appropriate application as noted below. Make sure you have the right application! Then, inside the development folder, go to the import section and select the prepare tab. Next, check the zero data box next to the appropriate cube and click the prepare button at the bottom. These steps are noted below in the screen shot.


After you click prepare, the following message will appear.

If you then click on the Job Management folder for the application, you will see the Prepare_Import job running. Once it completes, you will need to run a GTP (Go to Production) process by clicking the green arrow at the top. Please note that if you forget to run the GTP process, the data will not be zeroed out until you do.

ZERO OUT THE DATA QUEUE

A situation may arise where you load data via a Cognos Planning Analyst to Contributor link and then, after running the link, realize you didn't want to process the data all the way to Contributor. Don't worry. You can zero out the data queue. This will basically delete the staged data before you send it to production. A couple of important notes about this process: it should NOT be used in conjunction with the aforementioned Zero Data in a Contributor Model procedure, because doing so would delete the import queue created by the previous Prepare_Import job and the data would not zero.

Steps to Zero Out the Data Queue

The first step is to go to Cognos Planning and then the Contributor Administration area. Then select the appropriate application as noted below. Next, go to the development folder and select the Import Data area. Select the Prepared Data Blocks tab and choose Delete import queue. This will delete the staged or prepared data blocks. Also note that a Go To Production process is not required for this action. If you need to, you can fix the data in Cognos Planning Analyst and rerun the Analyst to Contributor link with the good data.


Hopefully this helps you understand how to zero out all of the data in Cognos Planning! My recommendation when you do this type of maintenance in Cognos Planning is to always FOCUS! Shut down your email and turn off your phone! I learned that lesson the hard way.

Some food for thought...

If you really want to simplify the maintenance of your budgeting and forecasting models, you should absolutely consider looking at TM1. At Lodestar Solutions, we feel the future of Cognos Planning is in question! Hence, we encourage you to start considering and defining your migration to a better tool. Check out our blog to learn the benefits of TM1 over Cognos Planning.

Lodestar Solutions wants you to be educated, so we are creating a library of videos that will help you evaluate TM1. Check out our Move to TM1 Program to get access to our library, and don’t forget that there is NO COST TO YOU for this program.

If you have specific questions, please contact us at Coaching@lodestarsolutions.com and one of our analytics coaches will contact you.

If you learned something from this blog, we encourage you to refer back to the previous tips in this series....

Check out Tip #1 on what your future plans should be for Cognos Planning by CLICKING THIS LINK.

For Tip #2 on the questions most asked by Cognos Planning clients, learn more by CLICKING THIS LINK.

Refer back to Tip # 3 on simplifying D-List maintenance by CLICKING THIS LINK.

The previous Tip #4 regarding the re-ordering of your E-List can be found by CLICKING THIS LINK.

For Tip #5 on why your cubes may not be opening, CLICK THIS LINK to learn more.

Are Your IBM Security Certificates About To Expire?

10/4/2016

The purpose of SSL (Secure Sockets Layer) certificates is to encrypt data as it is transferred around a network. These SSL certificates also have an expiration date. If your IBM security certificates expire, you may run into issues using certain client-side Cognos applications or gaining access to your data and system.

There are many good reasons for keeping your Cognos BI and TM1 systems up to date on fix packs. One of the most important is that it will keep the security (SSL) certificates updated or renewed! (Please note that the most recent fix packs do not have the SSL updates yet. IBM should have updated fix packs soon and we will update when available.)

Below, you will find some easy tips for determining when your certificates expire and how to check what version of TM1 or BI (with fix packs) you are operating.

Tips For Determining Expirations & Operating Versions

First, let’s start with the instructions to check on your expiration dates for the IBM security certificates. The summary instructions can be found by using this IBM Support Link.

Next, you will want to determine what version and fix packs of the Cognos software you are running by following this easy process. Keep in mind that you must be current on support in order to have access to fix packs. If you let this lapse or need assistance, please contact renewals@lodestarsolutions.com so that we may assist you.

Cognos BI

In regards to Cognos BI, you will need the information found using this IBM support link to compare to your settings.

Open Cognos configuration for BI, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed to the chart.


Cognos TM1

Cognos TM1 is very similar.  We suggest using this IBM support link to determine the version you are currently using.

Open Cognos configuration for TM1, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with PLANANALYTICS to the chart.


Cognos Express

Finally, for determining Cognos Express, open Cognos configuration for Cognos Express, hit CTRL-F3, and the properties menu will come up. Click on the installed components tab and compare the numbers listed with the charts above for BI and TM1. While there is not a specific chart for Cognos Express, the numbers should correlate.

Furthermore, as a general rule for Cognos BI, TM1, or Express, we recommend being on the most recent fix pack for the version you are running. Ideally, you want to be on the latest version and latest fix packs. For Cognos BI, TM1, and Express, that should be 10.2.2 unless you have upgraded to IBM Cognos Analytics (Cognos 11). The current version of Cognos 11 is 11.0.3, with release 4 coming soon. Additional fix pack information can be found by visiting the following IBM Fix Central link.

Additional Fix Pack Information

There are also a few other things you can do besides making sure you are on the latest fix packs:

  1. Disable SSL in tm1s.cfg. This is not recommended because it will leave your traffic unprotected by SSL.
  2. Generate your own IBM security certificate and then apply it.
  3. If you are using TM1 10.2.2, upgrade to a 2048-bit SSL certificate as discussed on this IBM Support FAQ link.
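For reference, option 1 corresponds to the UseSSL parameter in the server's tm1s.cfg (shown here only for completeness; leaving SSL enabled is strongly preferred):

```
# tm1s.cfg -- disables encryption between the TM1 server and its clients.
# Not recommended for production.
UseSSL=F
```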

Should you need assistance working through these instructions or any other questions regarding IBM security certificates, call Lodestar Solutions at 813-254-2040. If necessary, we will involve one of our technical consultants. We can also install the fix packs to ensure that your systems are up to date with the latest enhancements and updates!

UPDATE – IBM has announced that the update fix for this issue (including some enhancements) will be released at the end of October or early November 2016.

Cognos TM1 Blogs

IBM COGNOS TM1 Blogs

Below is a massive list of IBM Cognos TM1 blogs. To search for a specific topic, use the search bar to the right.

Introducing IBM Planning Analytics Core Financial Applications
Written by Mike Bernaiche, June 20th, 2024 IBM Planning Analytics is a powerful tool that helps organizations streamline their planning,[...]
Planning Analytics 2.1 Administration Training: A New Way to Administer Your Environment
Written by Mike Bernaiche, June 13th, 2024 With the announcement that Planning Analytics 2.0 will reach end of support in[...]
What’s New in Planning Analytics Workspace 2.1.3 (96)
Written by Mike Bernaiche, June 7th, 2024 This is it, one of the major updates I mentioned in my last[...]
End of Support for Planning Analytics 2.0.9
Written by Mike Bernaiche, May 23rd, 2024 There are some big changes coming soon to IBM Planning Analytics.  You can[...]
The Future of IBM Planning Analytics
Written by Mike Bernaiche, May 17th, 2024 Before we discuss the amazing future of IBM Planning Analytics we should start[...]
What’s New in Planning Analytics Workspace 95
Written by Mike Bernaiche, May 13th, 2024 Welcome to the cutting edge of business analytics! IBM’s Planning Analytics Workspace (PAW)[...]
What’s New in Planning Analytics Workspace 94
Written by Mike Bernaiche, April 4th, 2024 If you are using the AI forecast option in Planning Analytics, then you[...]
Beyond the ‘Office of No’: Orchestrating Harmony Between Immediate Gains and Lasting Strategy
Written by Heather L. Cole, March 8th, 2024 In the dynamic landscape of modern business, achieving harmony between short-term financial[...]
What’s New in Planning Analytics Workspace 93
Written by, Mike Bernaiche, February 29th, 2024 In February 2024, IBM released the next version of Planning Analytics Workspace 93.[...]
IBM Planning Analytics as a Service on AWS
Written by Mike Bernaiche, January 18th, 2024 IBM recently introduced a new partnership and offering for Planning Analytics.  Planning Analytics[...]
Revolutionize Your Financial Stability with Better Planning Models
Written by Heather L. Cole, November 9th, 2023 As a financial executive, you are constantly facing the challenge of ensuring[...]
Breaking Down Barriers: Unleash Corporate Agility with Extended Planning and Analytics
Written by Heather L. Cole, November 2nd, 2023 In the ever-evolving landscape of business, the traditional Financial Planning and Analysis[...]
What’s New in Planning Analytics Workspace 90
Written by Mike Bernaiche, September 22nd, 2023 Planning Analytics Workspace 90 was released on September 19, 2023, and there are[...]
Escaping the Budgeting Blues: Rolling Forecasts with IBM Planning Analytics
Written by Heather L. Cole, September 1st 2023 Budgeting – the word alone can send shivers down the spines of[...]
What’s New in Planning Analytics Workspace 87
Written by Mike Bernaiche, June 28th, 2023 On May 17, 2023, IBM released Planning Analytics Workspace 87.  A reminder that[...]
The Power of IBM Planning Analytics for Mid-Market Companies: Leveling Up from Excel
Written by Heather L. Cole, June 14th, 2023 In the fast-paced world of mid-market companies, there's a hero waiting to[...]
What’s New in Planning Analytics Workspace 86
Written by Mike Bernaiche, May 12th, 2023 On April 18, 2023, IBM released Planning Analytics Workspace 86.  Continuing with their[...]
What’s New in Planning Analytics Workspace 85
Written by Mike Bernaiche, April 4th, 2023 In March 2023, IBM released Planning Analytics Workspace 85.  Updates to Planning Analytics[...]
What’s New in Planning Analytics Workspace 2.0.84
Written by Mike Bernaiche, February 16th 2023 It is time again to discuss what is new in Planning Analytics Workspace[...]
Docker Support – Every Planning Analytics Customer Must Read
Written by Mike Bernaiche, January 10th, 2023 If you own Planning Analytics and use Planning Analytics Workspace (PAw) on Microsoft[...]
What’s New in Planning Analytics Workspace 83
Written by Mike Bernaiche, January 26th 2023 It is a new year and IBM has released the first update to[...]
2023 Lodestar Solutions Services
Written by Mike Bernaiche, December 23rd, 2022 Lodestar Solutions Services team is ready to help in 2023!  I wanted to[...]
New in Planning Analytics Workspace 82
Written by Mike Bernaiche, December 16th, 2022 IBM released Planning Analytics Workspace 82 on November 17, 2022.  While this isn’t[...]
What’s New in Planning Analytics Workspace Version 80 and 81
Written by Mike Bernaiche, November 3rd 2022 It is that time to discuss what is new in Planning Analytics Workspace[...]
New in Planning Analytics Workspace 79
Written by Mike Bernaiche, September 15th 2022 On September 1, 2022, IBM released Planning Analytics Workspace 79 or 2.0.79.  There[...]
Coming Soon in Planning Analytics Applications and Plans
Written by Mike Bernaiche, July 29th 2022 Recently, I had the opportunity to see what was coming in Planning Analytics[...]
New in Planning Analytics Workspace Version 77 and 78
Written by Mike Bernaiche, July 28th 2022 It is that time again!  Providing you everything new in planning analytics workspace[...]
What’s New in Planning Analytics Workspace 75 and 76
Written by Mike Bernaiche, May 19th 2022 Typically, once a month, IBM updates Planning Analytics Workspace.  In April version 75[...]
What’s New in Planning Analytics Workspace 73 and 74
Written by Mike Bernaiche, April 7th 2022 Typically, once a month, IBM updates Planning Analytics Workspace.  In February version 73[...]
IBM Planning Analytics Versions Are End of Life
Written by Heather L. Cole, March 3rd 2022 Are you an IBM Planning Analytics TM1 user that has not upgraded[...]
Scenario Planning with IBM Planning Analytics
Written by Heather L. Cole, November 4th 2021Scenario Planning with IBM Planning Analytics Did the Pandemic disrupt your business?  Did[...]
What’s New in Planning Analytics Workspace Version 67 and 68
Written by Mike Bernaiche, October 21st 2021 Over the last couple of months, IBM has released 2 updates to Planning[...]
Forecast in Planning Analytics with AI
Written by Mike Bernaiche, October 14th 2021 Did you know that you can forecast in Planning Analytics Workspace by using[...]
Scorecards and Metrics in Planning Analytics
Written by Mike Bernaiche, September 30th, 2021Did you know that you can track your key performance indicators (KPI’s) with Scorecards[...]
What’s New in IBM Planning Analytics Workspace Version 64, 65 and 66
Written by Mike Bernaiche, August 19th 2021IBM continues to update Planning Analytics Workspace on a regular basis.  The pattern has[...]
How to Burst Reports from IBM Planning Analytics
Written by Heather L. Cole on August 5th, 2021 Are you an IBM Planning Analytics client and wondering how to[...]
How to Connect TM1 to Power BI
Written by Heather Cole, May 13th 2021Have you been asked to pass the TM1 data to Power BI, Tableau, Qlik[...]
Planning Analytics Workspace – Now What?
April 30th, 2021 Recently, it came to my attention that some Planning Analytics (formerly TM1) clients are not clear about[...]
What’s New in Planning Analytics Workspace Version 61 and 62
Written by Mike Bernaiche on April 23rd, 2021IBM continues to release updates every 30 to 60 days for Planning Analytics[...]
How to Easily Move Development Changes to Production in IBM Planning Analytics TM1 with Soterre
February 4th, 2021 TM1 Administrators of the IBM Planning Analytics that follow best practices should be using a development environment[...]
Version Control and Audit Trail for IBM Planning Analytics TM1 with Soterre
January 28th, 2021 Are you a TM1 Administrator and looking for a better way to know which Modeler changed a[...]
What TM1 Licenses Include IBM Planning Analytics Workspace?
January 7th, 2021 Looking for a better user interface to your powerful Cognos TM1/ Planning Analytics data?  Do your users[...]
Guide to Creating a Virtual Hierarchy
November 5th, 2020 In todays planning and forecasting world, the ability to change quickly and provide valuable data to your[...]
What Kind of Audit Trail is Available for Cognos Planning Analytics (TM1)?
October 29th, 2020 A budgeting and planning solution should always provide a good audit trail.  So, what kind of audit[...]
Sandboxes in Planning Analytics Workspace
October 1st, 2020 Think back to when you were younger playing in a sandbox.  Grabbing your plastic shovels and building[...]
Input and Spread Data in Planning Analytics Workspace
September 24th, 2020  Now that you have read the introduction to Planning Analytics Workspace (PAw) blog found here, it is time[...]
Benefits to Using IBM Planning Analytics Workspace
September 17th, 2020 If you are a TM1 or Planning Analytics client and you are not using Planning Analytics Workspace,[...]
Planning Analytics Excel Options
August 20th, 2020 Many of you use Excel when working with Planning Analytics.  You may be creating budgets, exporting files,[...]
Why Buy TM1 Connect?
July 31st, 2020 We are often asked; how can I bring my TM1 data into other software and create reports[...]
Why is TM1 Slow?
Are you a TM1 user and frustrated with the speed? Are your perspective reports extremely slow? Does opening a cube[...]
Creating the COVID Forecast with Cognos Planning Analytics, TM1
April 30th, 2020 ​Wow, the world has changed in just a few weeks!  Our way of life as we knew[...]
Cognos Planning Analytics for Excel
March 12, 2020Upgrading TM1, checkout Cognos Planning Analytics for Excel PAX Are you a long time TM1 user looking to[...]
Move Action Buttons from TM1 Perspectives to Planning Analytics for Excel (PAX)
March 5th, 2020Are you planning your upgrade from TM1 to Planning Analytics? After you review all the benefits, steps and[...]
IBM Let the Cognos TM1 Grandfather Clause Die Putting Old Clients at Risk!
February 20th, 2020​To all you very old TM1 clients I urge you to check your IBM entitlements!  I recently learned[...]
Neither Cognos BI nor Microsoft Power BI can Replace Cognos Planning Analytics (TM1)!
February 13th, 2020Recently I met with a client who owned and was using both IBM Cognos BI and TM1.  Their[...]
Is it Time to Redesign Your TM1 Planning System?
December 12th, 2019As business analytics coaches it is our passion to talk about latest innovations in the financial planning and[...]
Complete Model Development in Planning Analytics Workspace
November 11, 2019As business analytics coaches it is our passion to talk about latest innovations in the financial planning and[...]
Edit Dimensions and Others Greyed Out in Planning Analytics Architect
September 20, 2019The problem - Edit Dimensions and Others are GreyWe get this question from time to time - Why[...]
What does it cost to upgrade from TM1 to IBM Planning Analytics?
September 10, 2019“Hey, I heard a rumor that after September 2019 my TM1 10.2.x will not be supported. What does[...]
Model Directly in IBM Planning Analytics Workspace. A one stop shop
1/22/2019 Are you asking yourself if it is  possible to build a model directly in IBM Planning Analytics Workspace? There is[...]
7 Reasons Why You Should Purchase Workspace (PAw)
 9/11/2018 7 Reasons Why You Should Purchase Workspace (PAw)You have probably heard a lot about Planning Analytics Workspace (PAw).  However, like[...]
Dashboards in TM1? Legacy Users Need to Purchase Workspace Add-On.
7/10/18 Are you a Legacy TM1 shop and your users are asking about dashboards in TM1?  For all of you[...]
New Hierarchy Level Behavior in Planning Analytics (TM1) Set Editor
3/20/2018Leaf Element Level naming convention is really changing...It's when I first started teaching the Performance Modeler (PM) class and showing[...]
Have You Secured Funding Yet? It’s Budgeting Season, Secure Funding Now For IBM Cognos
Have You Secured Funding Yet?  It's Budget Season.  Secure Funding Now for IBM Cognos!​Do you need process improvement and require[...]
Lodestar Solutions is Hiring – TM1 Consultant Needed
5/11/2017TM1 Consultant OverviewAre you an expert in Cognos TM1? Do you have advanced knowledge of Microsoft Excel, Access or SQL[...]
TM1 User Maintained Public Subsets
3/6/2017 As a TM1 Admin have you ever wished for user maintained public subsets? User maintained public subsets puts the[...]
Recreating TM1 Alternate Hierarchies
2/8/2017 Cognos TM1 allows alternate hierarchies which, from a user perspective, is greatly appreciated but from a TM1 Admin perspective,[...]
TM1 Websheets to Simplify, Document and Organize TI Process and Tasks
12/22/2016 I’m all about easy. It is a phrase I say repeatedly to my clients that are new to IBM[...]
Are Your IBM Security Certificates About To Expire?
10/4/2016 The purpose of SSL certificates (Secure Sockets Layer) are for encrypting data as it is transferred around a network.[...]
How to: Deleting or Adding Dimensions to TM1 Cubes
6/1/2016 Today’s topic is about deleting or adding dimensions to TM1 cubes that you are using in Production. With TM1’s[...]