int? foo = null;
foo += 123;

So what's wrong with this picture?
Well, if you apply arithmetic operations to a null value, you get null in return.
So instead of foo being 123, foo is still null.
It's easy to spot that foo is null in the example above. But across lots of lines of code, it's easy to lose track of the fact that foo is a nullable type, or that it could be null at the point where you perform calculations on it.
The correct way to perform that calculation is like so:
int? foo = null;
foo = foo.GetValueOrDefault() + 123;
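GetValueOrDefault() isn't the only safe option. The null-coalescing operator `??` and an explicit HasValue check do the same job; which one you pick is mostly a matter of style. A minimal sketch:

```csharp
int? foo = null;

// Option 1: GetValueOrDefault() falls back to 0 (or a default you supply)
int a = foo.GetValueOrDefault() + 123;

// Option 2: the null-coalescing operator ?? substitutes a fallback inline
int b = (foo ?? 0) + 123;

// Option 3: check HasValue explicitly when null needs its own handling
int c = foo.HasValue ? foo.Value + 123 : 123;
```

All three give you 123 instead of null, and all three make it obvious to the next reader that foo might be null.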
So if you're working with nullable types, you've got to keep in mind that null + 1 = null: C#'s lifted operators propagate null through arithmetic.
The problem with the code at the top is that it compiles just fine.
You never know you've made the error until your values start turning out as null.
In some cases it can take days to even realize something's not working right.
So always check your variable types. Hover the mouse over the variable, or hit the period after it, and see what pops up.
1 + 1 = 10 ;)
1 + null = null :O