In the following example, I'm returning a DateTimeOffset? using default as the fallback value:

var a = ConvertToDateTimeOffsetA(null); // 1/1/0001 12:00:00 AM +00:00
var b = ConvertToDateTimeOffsetB(null); // null

private static DateTimeOffset? ConvertToDateTimeOffsetA(DateTime? date)
{
	return date != null
		? new DateTimeOffset(date.Value, TimeSpan.Zero)
		: default;
}
private static DateTimeOffset? ConvertToDateTimeOffsetB(DateTime? date)
{
	if (date != null) 
		return new DateTimeOffset(date.Value, TimeSpan.Zero);
	return default;
}

Why does the ternary return a different value than the if statement, given that both use default?

My guess would be that the ternary first coerces default to DateTimeOffset and then converts the result back to Nullable&lt;DateTimeOffset&gt;, but I'm not sure why that would happen.

In the ternary version, the compiler interprets your default as default(DateTimeOffset), because the type of a conditional expression is inferred from its branches, and the other branch is a plain DateTimeOffset. The result of the whole conditional, which at that point is DateTimeOffset.MinValue, is then implicitly converted to the nullable return type, so it can never be null.
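A rough sketch of what the compiler effectively sees in the first method (the intermediate variable is only for illustration):

```csharp
// The conditional expression's type is inferred as DateTimeOffset,
// so `default` here means default(DateTimeOffset), i.e. DateTimeOffset.MinValue.
DateTimeOffset intermediate = date != null
	? new DateTimeOffset(date.Value, TimeSpan.Zero)
	: default;

// Only afterwards is the value lifted to the declared return type.
// Wrapping a non-null DateTimeOffset in a DateTimeOffset? never yields null.
return (DateTimeOffset?)intermediate;
```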

In the second case, the return statement's default takes its type directly from the declared return type, so it is default(DateTimeOffset?), which is null.

To avoid the surprise, you may want to use an explicit type in the default expression, or just use the second form and add a comment (and ideally a unit test) so nobody "fixes" it in the future.
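For example, spelling out the nullable type in the default expression keeps the ternary but restores the null result (the method name is just for illustration):

```csharp
private static DateTimeOffset? ConvertToDateTimeOffsetC(DateTime? date)
{
	return date != null
		? new DateTimeOffset(date.Value, TimeSpan.Zero)
		: default(DateTimeOffset?); // explicitly the nullable type, so this is null
}
```

With the explicit type, the conditional's branches are DateTimeOffset and DateTimeOffset?, so the expression as a whole is typed DateTimeOffset? and the null survives.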