If 0^0=1 then 0^0=0/0

Avatar of Andrew_2600

If 0^0=1 and only 1, then it follows that

0^(0-1) shall equal (0^0)/0=1/0 and likewise

0^(0+1) shall equal (0^0)*0=1*0=0

Therefore (1/0)*0=0/0, and that equals 0^0.

If mathematics is a consistent science, we must follow its rules of operation.
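
Side note: the step 0^(0-1) = (0^0)/0 leans on the identity a^(n-1) = a^n / a, which only holds when a is not 0. A quick sketch of where that recurrence breaks down (Python here, purely as an illustration):

    # Illustrative only: the recurrence a**(n-1) == a**n / a works for a != 0
    # but divides by zero the moment a == 0.
    for a in (4, 3, 2):
        assert a ** 0 == a ** 1 / a == 1   # e.g. 4^0 = 4/4 = 1

    print(0 ** 0)          # most software, Python included, returns 1 by convention
    try:
        print(0 ** 1 / 0)  # the same recurrence applied at a = 0 ...
    except ZeroDivisionError as err:
        print(err)         # ... is a division by zero, exactly as the post says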

Avatar of DavidNorman435
haha
Avatar of DavidNorman435
Although 0-1 = -1, not 0, in math
Avatar of DavidNorman435
So 0^(0-1) is actually 0^-1, which is 1/0, so that number isn't real, because anything divided by 0 isn't a real number
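
Indeed, both integer and floating-point arithmetic treat a zero base with a negative exponent as an error rather than a number. A quick check in Python, purely illustrative:

    import math

    try:
        0 ** -1                # integer power: zero to a negative exponent
    except ZeroDivisionError as err:
        print(err)             # "0.0 cannot be raised to a negative power"

    try:
        math.pow(0.0, -1.0)    # float power: same refusal
    except ValueError as err:
        print(err)             # "math domain error"
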
Avatar of Andrew_2600
DavidNorman435 wrote:
So 0^(0-1) is actually 0^-1, which is 1/0, so that number isn't real, because anything divided by 0 isn't a real number

 

Yes, that is true as well. But my point is that 0^0 is not a real, defined number either, the way 1 is.

Avatar of Andrew_2600

If n/n = n^(1-1) = n^0, why should an exception be made for 0, just for the "convenience" of operations?

Avatar of DavidNorman435
OK, so I see what you mean:
3/3 = 1,
And 3^0 = 1,
So 3/3 = 3^0

But in order to understand that, we must know why anything to the power of 0 is equal to 1 before we can refute/defend anything on that subject, but I forgot the explanation.
Would you mind enlightening me?
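
For reference, the usual argument is that for n ≠ 0, n^0 = n^(k-k) = n^k / n^k = 1. A quick numeric spot check, Python used purely as a sketch:

    # n**0 == n**k / n**k == 1 for any nonzero n and any exponent k
    for n in (3, 7, -2, 0.5):
        for k in (1, 2, 5):
            assert n ** 0 == (n ** k) / (n ** k) == 1
    print("n**0 == 1 checks out for every nonzero n tested")
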
Avatar of DavidNorman435
I think it was something like n^0 = n/n, so in this case n = 0 (the number that's the exception), then 0/0 = 1 because anything divided by itself equals 1
Avatar of DavidNorman435
So technically it is an exception, but it kind of makes sense because, I guess, like you're saying, anything divided by 0 isn't a real number, so it kind of doesn't make sense at the same time
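
That "anything divided by itself equals 1" rule is exactly the step that fails at zero. A small check, using Python with NumPy only to show the IEEE floating-point behaviour, purely illustrative:

    import numpy as np

    try:
        0 / 0                  # exact arithmetic: an error, not 1
    except ZeroDivisionError as err:
        print(err)

    with np.errstate(invalid="ignore"):
        print(np.float64(0.0) / np.float64(0.0))   # IEEE 754 answer: nan, i.e. indeterminate
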
Avatar of DavidNorman435
And also, 0 (speaking literally here) is the equivalent of nothing, and nothing (the thing, not the nothingness of it) normally gets the exception
Avatar of Andrew_2600

If we know that the 0th power of 4 is 1, because 4^(1-1) = (4*1)/4 = 1 and likewise 1*4 = 4, then we know that 3 is the 0th power of 0, because 0^(1-1) = (0*3)/0 = 3.

The claim that 0^0 equals just 1 can therefore be proven false. Yet Google Calculator and schools across the world insist that it is true.
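
For what it's worth, the 0^0 = 1 convention is baked into most software, not just Google Calculator. For example, in Python, shown purely as an illustration:

    import math

    print(0 ** 0)               # 1   (Python's integer power)
    print(0.0 ** 0.0)           # 1.0 (float power follows the same convention)
    print(math.pow(0.0, 0.0))   # 1.0 (the C / IEEE 754 pow convention)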

Avatar of Andrew_2600

We know that infinity^0 = any real positive number n, because n times infinity still equals infinity for any positive real n.

Infinity^0 = infinity^(1-1) = infinity/infinity, which equals any positive real n, because any positive real n times infinity is still infinity.

 

So why should there be an exceptional rule for 0? 

Answer: by the examples shown above, there isn't. 
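
Interestingly, floating-point arithmetic draws exactly this line: infinity/infinity is left indeterminate, while infinity^0 is pinned to 1 by convention. A quick Python check, illustrative only:

    inf = float("inf")
    print(inf / inf)   # nan -- infinity/infinity is left indeterminate
    print(inf ** 0)    # 1.0 -- IEEE 754 pow pins x**0 to 1 for every x, even infinity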

Avatar of 20gblum

0/0 and infinity^0 are both indeterminate forms. In other words, they do not have a defined value, which means their value can technically be any real number. That is calculus at its finest.
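
The limit picture is what makes 0^0 indeterminate: depending on how the base and the exponent approach 0, you can get different answers. A quick numeric sketch, Python purely as an illustration:

    # As x -> 0+, x**x heads to 1 while 0**x stays at 0:
    # two different candidate values for 0**0.
    for x in (0.1, 0.01, 0.001, 0.0001):
        print(f"x={x:<7} x**x={x ** x:.6f}   0**x={0.0 ** x:.1f}")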

Avatar of Prometheus_Fuschs

0^0 is undefined....

Avatar of DavidNorman435
And nothing can equal anything, so technically, he's right
Avatar of m_connors

Reminds me of a "proof" from my high school days (too many years ago) that "proved" 1=2. The sequence began with a=b and proceeded to add and multiply variables on both sides. Mathematically, all of the equations seemed to make sense. It was only upon closer inspection that one of the multiplicands was seen to be (a-b), and since the premise started with a=b, a-b was zero (0).

Of course, anything multiplied by 0 is, well, 0. So at the midway point of this "proof" we had 0 = 0. However, since the theorem was using the letters a and b, this was not readily noticeable, so when it finally "proved out" we had something like 2a=b, and since a=b, 2a=a, or 2=1.

PS. As I remember it, division by zero is "undefined", so it cannot be done. (I said "as I remember it"; we are going back nearly half a century!!)

 

Avatar of Prometheus_Fuschs
m_connors wrote:

Reminds me of a "proof" from my high school days (too many years ago) that "proved" 1=2. […]

 

Dividing by zero is probably one of the worst nightmares for math students; I've certainly fallen for it!

Avatar of FlashyFerrari
m_connors wrote:

Reminds me of a "proof" from my high school days (too many years ago) that "proved" 1=2. […]

 

Let a = b.

a² = ab

a² - b² = ab - b²

(a+b)(a-b) = b(a-b)

a+b = b

2b = b

2 = 1
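
The slip is the step from (a+b)(a-b) = b(a-b) to a+b = b, which divides both sides by a-b, and a-b = 0. Plugging in a concrete value makes it visible; Python here purely as an illustration:

    a = b = 3
    print(a * a == a * b)                    # True: a^2 = ab
    print(a * a - b * b == a * b - b * b)    # True: both sides are 0
    print((a + b) * (a - b) == b * (a - b))  # True: still 0 == 0
    try:
        print(((a + b) * (a - b)) / (a - b)) # "cancelling" (a-b) ...
    except ZeroDivisionError as err:
        print(err)                           # ... is the hidden division by zero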