How is 0! equal to 1?
For natural numbers (1, 2, 3, …) it is defined as n! = n·(n-1)·(n-2)·…·1
I know 1! = 1, and since n! = n·(n-1)!, that gives 0! = 1!/1 = 1
But factorial means _ multiplied by everything lower than _ (except for negatives)
I guess 0 breaks the rules, to be honest
1. Instead of using underscores, using a variable (say n) works and is clearer.
2. That definition is wrong because it includes 0, which would make every factorial equal 0. You mean n multiplied by all the natural numbers less than it.
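The recurrence n! = n·(n-1)! with the base case 0! = 1 can be sketched in code; the function name `factorial` is my own, and this is just an illustration of why the base case makes the definition consistent, not a definitive implementation:

```python
def factorial(n: int) -> int:
    """Compute n! iteratively.

    The running product starts at 1, which is exactly the empty-product
    convention: for n = 0 the loop never runs, so 0! = 1 falls out
    naturally and the recurrence n! = n * (n-1)! holds for all n >= 1.
    """
    if n < 0:
        raise ValueError("factorial is not defined for negative integers")
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial(0))  # 1
print(factorial(5))  # 120
```

Note that defining 0! = 1 is what keeps identities like 1! = 1·0! true, rather than being an arbitrary exception.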