Saturday, 11 February 2017

MS Excel - How to get rid of the GetPivotData function in formulas?


Have you tried copying a formula that references cells inside a pivot table? You may find that the values in the target cells don't change. You may also notice that the formula doesn't reference the cells in the usual way, like A1 or C2. Instead, it uses the GetPivotData function. For example, I tried creating a formula adding J4 and I4, both of which lie within a pivot table, but the formula that appeared in the target cell was something like the one below:

=GETPIVOTDATA("Sum of Dec'16",$A$3,"Group","Administrative Expenses")+GETPIVOTDATA("Sum of Nov'16",$A$3,"Group","Administrative Expenses")

Copying this formula to the adjoining cells does not change the cell references, so the values don't change either. How do we work around this issue?


While creating the formula, you probably used the mouse to select the cells referenced in it. Try typing the cell references instead, without using the mouse. For instance, just type '=J4+I4' and the formula remains exactly as typed, with no reference to the GetPivotData function. Now copy this formula to the adjoining cells and it works as usual, with no issues. So the problem arises only when you select the referenced cells using the mouse or touchpad.

Why Excel behaves like this is a separate question, but if you don't like this behavior and want to disable it permanently, here is what to do. It's simple.

Bring up Excel Options by clicking File --> Options. (There may be different ways of reaching Excel Options in different versions of Excel.) Under the Formulas tab, in the 'Working with formulas' section, you will find a check box labelled 'Use GetPivotData functions for PivotTable references'. Given below is a screenshot of Excel 2016.

There you go: just clear this check box and Excel won't use the GetPivotData function any more.

Saturday, 21 January 2017

Bootstrap Carousel - Different intervals for different slides


Bootstrap Carousel is an easy and convenient option for implementing a slider on your website. But is there a way to advance the slides at different time intervals? For example, slide 1 advances after 10 seconds, slide 2 after 5 seconds, and slide 3 after 3 seconds. I came across a scripting suggestion that uses the carousel method with the 'pause' parameter, i.e. pausing the automatic cycling for an amount of time computed from each slide's timer. For some reason that solution didn't work for me, but I got it working with a different approach, which is explained below in the hope that it will be useful for others as well.


Instead of using the pause option, I used a script to advance the slides manually by calling the carousel method with the 'next' argument, in combination with the setTimeout function. For example, something like this should get it working.

    var t;
    // interval (in ms) of the initially active slide, read from its data-interval attribute
    var start = $('#myCarousel').find('.active').attr('data-interval');

    t = setTimeout(function () { $('#myCarousel').carousel('next'); }, start);

The block above ensures that the first slide advances after the interval set for it. Subsequent slide advances can be handled using the slid.bs.carousel event, as below.

    $('#myCarousel').on('slid.bs.carousel', function () {
        // interval (in ms) of the slide that has just become active
        var duration = $('#myCarousel').find('.active').attr('data-interval');

        t = setTimeout(function () { $('#myCarousel').carousel('next'); }, duration);
    });

Hope this helps. Please note that for this to work, you must use the data-interval attribute to set the desired interval in milliseconds on each slide, e.g.

    <div class="item" data-interval="4000">

I have found this to work fine with a single carousel on a page; I have not tested it with multiple carousels on the same page.

Saturday, 10 September 2016

ASP.NET - Setting HTML Meta tags in content pages


Though the @Page directive in a content page allows one to specify the Description and Keywords attributes, these are ignored when the page is rendered. Instead, the Description and Keywords specified in the master page are what get rendered in the final HTML output. Is this the intended design, and is there a solution to work around this issue?


By design, the HTML meta tags specified in the content page are ignored when the page is rendered. In this context it is important to understand the following:

  • The master page, not the content page, contains the <head> tag of the page. As such, the meta tags specified in the master page will prevail.
  • The content page derives from the System.Web.UI.Page class, which recognizes the Title attribute provided as part of the @Page directive, but does not recognize other meta attributes like Description and Keywords.
  • The master page and content page events fire in the following order:
    • Content page PreInit event
    • Master page Init event
    • Content page Init event
    • Content page Load event
    • Master page Load event

As you may observe, while the Load event of the master page fires after that of the content page, the Init event of the master page fires ahead of that of the content page. Given that the master page loads after the content page, you can pick up the Title attribute specified in the content page using an inline expression within the <title> element, as below:

<title><%: Page.Title %></title>

You cannot, however, handle the meta tags in the same way. One solution for honouring the meta tags specified in the content page is to use a custom page class that extends System.Web.UI.Page and adds support for the Description and Keywords values supplied in the @Page directive. This can be accomplished by overriding the OnLoadComplete method of the Page class, constructing the needed meta tags from the values specified in the @Page directive and adding them to the page header.

Check out this codepage link for a sample solution.

Saturday, 9 July 2016

SSRS - Custom Period Filters


Just wondering if reports published on SQL Server Reporting Services can be customized to have predefined period filters like Last Week, Last Month, Last Quarter, etc.

By: Anonymous


You have a few options to work around this, which are given below:

1. Dynamic Parameter

Basically, your underlying query or stored procedure needs the From and To parameter values to apply in the WHERE clause of the query. You can do this by first creating a Period parameter with multiple available values, like Last Week, Last Month, etc. The From and To parameters can then be set up to fetch their default values from a query that takes the value of the Period parameter and returns the corresponding From and To dates respectively; for the From parameter's default value, this can be as simple as a CASE expression mapping each Period value to a date (using DATEADD relative to today, for instance). For this to work, the Period parameter must be positioned above the From and To parameters. You may extend this idea in different ways to make a parameter dynamically derive its value from a previous parameter. Note that the order of the parameters is important, as the parameters are evaluated sequentially and there is nothing like an onChange event that you can rely on.
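
To make the Period-to-dates idea concrete, here is a minimal sketch of the date arithmetic outside SQL. This is Python rather than the report's own query language, and the period names and week conventions (weeks starting on Monday) are assumptions for illustration, not part of the original report:

```python
from datetime import date, timedelta

def period_range(period, today=None):
    """Map a Period value to a (From, To) date pair."""
    today = today or date.today()
    if period == "Last Week":
        # previous Monday through the following Sunday
        start = today - timedelta(days=today.weekday() + 7)
        return start, start + timedelta(days=6)
    if period == "Last Month":
        first_of_this_month = today.replace(day=1)
        last_of_prev_month = first_of_this_month - timedelta(days=1)
        return last_of_prev_month.replace(day=1), last_of_prev_month
    raise ValueError("Unknown period: " + period)
```

The same mapping, expressed with DATEADD/CASE in the default-value query of the From and To parameters, is all the "dynamic parameter" option needs.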

2. Let the SQL Query / Procedure do the work

The second alternative is to put logic similar to the above within the report query or stored procedure, so that the user just selects the period filter and SQL Server interprets it appropriately and returns the corresponding results. The difference with this option is that the From and To filters need not be designed in the report, and the user won't see them on screen.

3. Wrap the report within an Application

Just in case you are rendering the reports within an application, like an ASP.NET application or a Windows Forms application, you have complete control over the filters in the presentation layer of your application. In this case, you accept a period filter from the user in your front-end application, transform it into the appropriate From and To dates, pass those to the ReportViewer component, and get the report rendered accordingly.

Sunday, 3 July 2016

Issues in Signing a .NET Assembly using .pfx file


I have procured a code signing certificate with a view to ensuring that an application we intend to distribute to external users is signed, so that it can be trusted by the end users. I imported the certificate into the certificate store and used it successfully to sign the ClickOnce manifest using the Signing tab in the Project Properties dialog in Visual Studio 2013. But I could not get the assembly signed. I exported the certificate from the store as a .pfx file and tried using it to sign the assembly, but I am getting errors like "Private key not found", although the private key is very much present in the .pfx file.

By: Rajkumar David


Visual Studio 2013 has known issues handling PKCS#12 certificate files, as it cannot handle files with multiple certificates in the CA certificate chain. Even then, Visual Studio may not consider such a certificate for signing the assembly because of the KeySpec parameter, which is usually set to AT_KEYEXCHANGE (1), whereas Visual Studio expects it to be AT_SIGNATURE (2). It is possible that the KeySpec was set to 1 while requesting the certificate, and the certificate was therefore generated with that value. You may verify the CSR that you submitted to the CA to check this.

OK, now what do you do with that certificate? You have the option to re-import such certificates with KeySpec set to AT_SIGNATURE. Windows Server 2003 onwards ships a command-line certificate import utility, certutil.exe, which allows you to import the certificate with the right KeySpec parameter. Use the following command to do this:

certutil -importPFX -user <pfxfilename> AT_SIGNATURE

After importing the certificate into the certificate store using the above command, export it back from the store as a .pfx file for assembly signing within Visual Studio. This should resolve the problem.

More on the CertUtil command can be found here.

Sunday, 29 November 2015

How to Create '|' (Pipe) Delimited Files in Excel


Microsoft Excel is a very useful tool for data analysis. It supports importing data from various sources and formats for analysis. But though it supports exporting data in various common formats, it does not natively support export to a delimited text file with a delimiter other than the comma and tab characters. This post explains how to get Excel data exported into a "|" (pipe) delimited text file.


Excel does not directly support saving data delimited with a pipe or other such characters, though it does support comma-delimited files (.csv) and tab-delimited files. The CSV generator, however, uses the List Separator set in the Windows regional settings as the delimiter. So if you change this value in the regional settings, the CSV option will produce a file delimited with the character of your choice. For those not familiar with the regional settings, here is how to get this accomplished:

If you are using Windows 8 or 8.1, you will find the Regional Settings option under the "Clock, Language and Region"  category.

Click the Additional Settings button in the Formats tab of the Regional Settings dialogue box. In the Numbers tab of the resulting dialogue box, you will find the List Separator field (highlighted in the image). By default, it displays "," (comma). Now set it to the character you need the files to be delimited with. For instance, if you want to export a pipe-delimited text file from Excel, enter the character "|" in this field and apply the change.

You are done. Close and re-open Excel with the data you want to export, and use the Save As option to save the sheet as a CSV file (.csv). This will now produce a text file delimited with the pipe character. Unless you regularly use pipe as the delimiter, you may want to set the List Separator back to comma afterwards, so that it does not impact any other operations that depend on it.
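
If changing a system-wide setting is undesirable, another route is to save the sheet as a normal comma-separated .csv from Excel and then re-delimit it with a short script. A minimal Python sketch using the standard csv module (the file names here are placeholders, not anything Excel produces by itself):

```python
import csv

def re_delimit(src_path, dst_path, delimiter="|"):
    """Re-save a comma-delimited CSV as a pipe-delimited text file,
    preserving proper quoting of fields that contain the delimiter."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst, delimiter=delimiter)
        writer.writerows(csv.reader(src))
```

This leaves the Windows List Separator untouched, so other applications that depend on it are unaffected.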

Sunday, 20 September 2015

Powershell Script for Cleaning up Old Files


I am working on a project that creates hundreds of files every hour, which has led to the creation of several thousand files within a week. Due to the very high number of files, Explorer takes very long to list the contents of the folder. As this could fill up the storage pretty fast, I need to delete files older than 7 days. Please suggest how best to accomplish this, preferably using PowerShell and leveraging Scheduled Tasks.

By: Anonymous


PowerShell is the way to go for these kinds of tasks. For deleting old files, the following script will do the job:

Get-ChildItem -Path <Base Folder> -Recurse | Where-Object CreationTime -lt (Get-Date).AddDays(-7) | Remove-Item

Replace <Base Folder> with the actual base folder that needs to be cleaned up. Also, as you may observe from the script above, it deletes files based on the file's creation time. If you want to use the last modified time instead, replace CreationTime with LastWriteTime.
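
For comparison, the same cleanup can be expressed in a few lines of stdlib-only Python. This sketch mirrors the LastWriteTime variant of the one-liner, since the last-modified timestamp is what is portably available across platforms; the base folder and the 7-day window are parameters:

```python
import time
from pathlib import Path

def cleanup(base_folder, days=7):
    """Delete files under base_folder whose last-modified time is
    older than the given number of days."""
    cutoff = time.time() - days * 24 * 60 * 60
    for path in Path(base_folder).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
```

Like the PowerShell version, this recurses into subfolders and removes only files, leaving the folder structure intact.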

The above script can be executed from the PowerShell prompt or can be scheduled to run automatically using the Windows Task Scheduler. There are a couple of things you should take care of while scheduling:

  • By default, the run may fail because the execution policy does not permit running the script. To override this, set the ExecutionPolicy parameter to Bypass.
  • You may want to add the -noninteractive parameter so the script runs automatically without needing user action.
  • Do not place the script itself in the program field. Instead, save the script as a .ps1 file and pass the script file, with its fully qualified path, in the parameter field.

Typically, the following should be added to the parameter field:

-noninteractive -nologo -ExecutionPolicy Bypass -command "& 'c:\datafiles\cleanup.ps1'"

Needless to mention, the program/script field shall contain the PowerShell executable with its fully qualified path, which will be like this:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
You may also want to have a look at the cleanup script on the TechNet site, which has many more capabilities, like creating a log of deleted files and so on.