Multiple usage of functions to ensure “value” is not null?!

In this post I'm describing a piece of code that is not well written. The author (unknown) applied three functions to a variable to ensure it never receives NULLs. The code is shown in the screenshot below.

Screenshot 1. The query with three functions on OriginalAmount
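The screenshot itself is not reproduced here, so below is a rough sketch of the pattern it shows. Only OriginalAmount, WalletID, the Transaction table and the @TotalDepost variable are named in the post (the NOLOCK hint comes up in the comments); the parameter names, the data type and the exact WHERE clause are assumptions.

DECLARE @TotalDepost decimal(18, 2);  -- the data type is an assumption

-- ISNULL is applied to every row inside SUM, and once more around the whole aggregate;
-- @WalletID, @DateFrom and @DateTo stand in for the stored procedure's parameters.
SELECT @TotalDepost = ISNULL(SUM(ISNULL(t.OriginalAmount, 0)), 0)
FROM dbo.[Transaction] t WITH (NOLOCK)
WHERE t.WalletID = @WalletID
  AND t.TransactionDate >= @DateFrom
  AND t.TransactionDate < @DateTo;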

As you can see, the author used three functions on the OriginalAmount column. I disagree with that approach of using the ISNULL function twice. Additionally, replacing a NULL with 0 (instead of 0.0) forces an unnecessary implicit conversion. My replacement for the query above is the following.

Query 1. The updated query: the ISNULL functions are removed and only SUM is used
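As a sketch under the same assumed names, the rewrite keeps only the aggregate:

-- Only SUM touches OriginalAmount; NULL rows are simply skipped by the aggregate.
SELECT @TotalDepost = SUM(t.OriginalAmount)
FROM dbo.[Transaction] t WITH (NOLOCK)
WHERE t.WalletID = @WalletID
  AND t.TransactionDate >= @DateFrom
  AND t.TransactionDate < @DateTo;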

In order to prove to the author that the extra ISNULL functions are not needed, I'm putting together the following demo with data from the referenced tables. First I execute the query for an example WalletID = 40689 to take a look at the data, shown in the screenshot below.

Screenshot 2. The query data

Next I’m creating a temp table and populating it with the data for WalletID=40689.

Then I check whether there are any zero values in the OriginalAmount column and run the following update, just to make sure I have NULLs instead of zeros in the data set. I want to make the example closer to a potential real-case scenario with multiple NULLs, which is what the author was presumably worried about.
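A sketch of that check and update, against the assumed #TransactionDemo table:

-- Check whether the sample contains any zero amounts...
SELECT COUNT(*) AS ZeroAmounts
FROM #TransactionDemo
WHERE OriginalAmount = 0;

-- ...and turn them into NULLs so the demo data set contains multiple NULLs.
UPDATE #TransactionDemo
SET OriginalAmount = NULL
WHERE OriginalAmount = 0;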

Running the same query from above produces the same result.

Screenshot 3. The refactored query against the temp table
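In other words, something along these lines (again only a sketch):

-- The same aggregation against the temp table; SUM simply skips the NULL rows,
-- and the NULL-elimination warning shows up in the Messages tab.
SELECT SUM(OriginalAmount) AS TotalDepost
FROM #TransactionDemo;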

In the Messages tab there is the expected warning about the elimination of NULLs in aggregate functions.

Warning: Null value is eliminated by an aggregate or another SET operation.

The warning does not change the results. The query can be rewritten in a more elegant way by using only the SUM function.

Applying two additional functions in such a simple query, especially considering that the Transaction table holds millions of rows and there can be thousands of transactions per WalletID, is not well-designed code.

However, if for any reason there is still a possibility of getting a NULL, and just to ensure I really never end up with a NULL “value” in the @TotalDepost variable, I can apply the ISNULL function to it at the end, where it won't affect the performance of the code.
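As a sketch, with the same assumed names, that safety net is a single assignment after the aggregation:

-- Applied once to the variable, instead of once per row of the Transaction table,
-- and with 0.0 so no implicit conversion is forced.
SET @TotalDepost = ISNULL(@TotalDepost, 0.0);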


7 Replies to “Multiple usage of functions to ensure “value” is not null?!”

  1. First, I appreciate anyone that steps up to the plate with an article or blog post on SQL Server. Thanks for that.

    That being said, I believe you’ve come to an improper conclusion.

    The problem with the warning message of …
    “Warning: Null value is eliminated by an aggregate or another SET operation”
    … is that it’s an unexpected return that can cause many GUIs to think it’s an error rather than an inconsequential warning. The message needs to be suppressed, and that was the purpose of the original code. While I do agree with the principle of “DRY” code, the fact that the original author also wanted to guarantee that an overall null was converted to a 0 within the query rather than in a separate query may also be inconsequential because ISNULL is very fast.

    The warning message MUST be suppressed if the call to the code is made by a GUI, and the only way that you’ll be able to actually tell if the extra ISNULL within the SELECT causes an issue with performance is to test it… which you didn’t do.

    You also missed the extreme optimization of testing for WHERE Original_Amount > 0, which would negate any processing on 0 amounts but also negate the need for the ISNULL to prevent the warning, because NULLs won’t fall into the comparison to begin with.

    • Jeff, thanks for your remarks!
      I will certainly take them into account.
      The main reason for this post was the performance issue caused by ISNULL.
      Additionally, the idea for that SQL fragment of a stored procedure was not to touch much of the code, but to see an improvement by excluding ISNULL.
      I will look at your other remarks later.
      Thanks,

  2. Let me give you a little history on this. When we were looking at the behavior of aggregation in ANSI X3H2, we considered situations where having a null dropped out of an aggregate would be just fine, and situations where it would not be fine. The example given (as best I remember after all these decades) was a report giving a summary of sales by teams. If all the team members have turned in their total sales, then the sum, average, etc. are good. However, if there is one slacker in the bunch who doesn’t turn in his reports on time, we can’t do anything with it and we need to go over to his cubicle and beat the hell out of him. The decision was to flag this as a warning and not terminate execution. This was proposed by interns at NIST and we put it in the standard.

  3. Isn’t querying financial transactions pretty much the textbook example of why indiscriminate application of the NOLOCK hint (i.e. “give me dirty data”) is a bad idea?

    • You could say so.
      I was told that the query/sp is used for reporting purposes and that it isn’t a problem for them. NOLOCK is used for reducing locking, and the report is usually called with past date ranges.

  4. I suggest you go back and talk with those who told you NOLOCK isn’t a problem. NOLOCK can return duplicate values due to concurrent data movement, which makes that SUM non-deterministic. If the counter-claim is that the SUM is being performed on ‘old’ data (and thus concurrent data movement is ‘not possible’), it appears the claimant is simplistically presupposing how SQL Server will be fetching OriginalAmount (i.e. the column being aggregated). I suggest NOLOCK be reconsidered with the query’s actual execution plans in hand, using realistic data that is predicated with various historically used date ranges. You may discover different execution plans have been used, and those differing plans reveal that the presupposition has no basis in reality.

    I also suggest comparing the rewrite’s STATISTICS TIME and STATISTICS IO with the query as originally written :).
