Implementing Null Objects and making (incomplete) attempts to prevent values from ever being null is not really an argument in favor of nullable values being good, let alone the only kind of value a language should have.
"All non-primitive values are nullable references" is a feature of the language. You posted that you are trying to avoid using that feature (instead using non-null references to a special Null value?) and trying to avoid having null references for the types you create. It seems like you do not actually think the feature is a good feature.
Can you explain why it is good that null is a valid value for your types while you are simultaneously trying to prevent them from ever containing it?
If the semantics were the same, you wouldn't need a replacement and all your proposed changes would have no observable effect on the behavior of any program.