Exploiting the Limit Definition

0.99... vs. 1: a Real Problem

Exploiting the Limit Definition (Non-Newtonian Calculus). In this category of false proofs of 0.99... = 1, the assumption is that the value of a function exists because either the right-hand or left-hand limit exists; that is, the value of a function or series is assumed to exist because one of its limits exists. Additionally, it is assumed that the limit of a function, sequence, or series represents the exact value of that function.
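As a minimal sketch of this distinction (the function f(x) = sin(x)/x and the sample points below are chosen purely for illustration): the left-hand and right-hand limits of f at 0 both exist and equal 1, yet that by itself does not assign a value to f(0).

from math import sin

# f(x) = sin(x)/x is undefined at x = 0, yet its left- and right-hand
# limits there both exist and equal 1. The existence of a limit does not,
# by itself, assign a value to f(0).
def f(x):
    return sin(x) / x  # raises ZeroDivisionError at x = 0

for x in (0.1, 0.01, 0.001, -0.001, -0.01, -0.1):
    print(x, f(x))  # values approach 1 from both sides, but f(0) itself does not exist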

To put it simply, some mathematical generalists claim that 1/10^n eventually becomes zero. This ignores the fact that adding 1/10^n to itself 10^n times always results in 1. If 1/10^n were zero, adding zero any number of times would always result in zero.
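A small sketch of this arithmetic, using Python's exact Fraction type (the loop bound is arbitrary and only for illustration): for every finite n, 1/10^n is strictly positive, and adding it to itself 10^n times gives exactly 1, never 0.

from fractions import Fraction

# For any finite n, 1/10^n is a strictly positive rational number,
# and adding it to itself 10^n times gives exactly 1, never 0.
for n in range(1, 6):
    term = Fraction(1, 10 ** n)       # exact value of 1/10^n
    total = term * 10 ** n            # the same as summing the term 10^n times
    print(n, term, term > 0, total)   # term is never zero; total is always 1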

The history of calculus is important to understand, particularly how Newton addressed the concept of 0/0. The integral symbol, ∫, is derived from the Latin word "summa," indicating summation, and most people know that the concept of integration comes from summation. Newton used Δx to calculate derivatives and invented calculus. People like Cauchy attempted to eliminate the need for Δx by introducing the concept of limits. However, they fell short of replacing dx (Δx) or proving that it is actually zero. Renaming Δx (delta x) as dx didn't solve the problem; it just buried the infinitesimal for a century or so. We know that an integral is the sum of the terms f'(x) · dx (Δx), and that this sum gives back f(x). The point is that dx (Δx) is essentially an infinitesimal, and this is proof that it cannot be zero. The limit is the heart of calculus, yet it is not well defined (or is poorly defined), even though it contradicts the notion of Δx being zero. If dx (Δx) were zero, then every integral would be zero (a sum of f'(x) · 0 = 0), which is not the case. See https://www.youtube.com/shorts/In2msBtAZto.
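A rough sketch of the sum described above (a left-endpoint Riemann sum; the choice of f'(x) = 2x on [0, 1], whose antiderivative is f(x) = x^2, is only an illustrative assumption): every term multiplies f'(x) by a finite, non-zero Δx, and the total approaches f(1) - f(0) = 1. If Δx were literally zero, every term, and hence the whole sum, would be zero.

# Left-endpoint Riemann sum for f'(x) = 2x on [0, 1]; the antiderivative is f(x) = x^2.
def riemann_sum(fprime, a, b, steps):
    dx = (b - a) / steps              # finite, non-zero Δx for any finite number of steps
    return sum(fprime(a + i * dx) * dx for i in range(steps))

fprime = lambda x: 2 * x
for steps in (10, 100, 1000):
    # Each term multiplies f'(x) by a non-zero dx; the total approaches f(1) - f(0) = 1.
    print(steps, riemann_sum(fprime, 0.0, 1.0, steps))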

We understand that assuming certain things can simplify our lives, like thinking the Earth is flat when we go shopping, play basketball, or drive to another city using a map. However, ease of use isn't proof that the Earth is flat; similarly, assuming delta x is zero in calculus makes calculations easier and seems logical until proven otherwise. 

Most people today think "rigor" means correct and proven facts. Rigor actually refers to a set of carefully examined and accepted rules, which are not necessarily correct but are assumed to be true. Occasionally, if these rules are found to be incorrect or contradictory, they are either discarded or revised. So, when we refer to the "rigorous definition" of calculus, it simply means we have decided to accept it as correct. That does not constitute a proof, nor does it guarantee correctness.


The above shows the misconception regarding the limit definition at the time of publishing The Naked Emperor playlist. We can see that this misconception leads to claiming 0.99... = 1, which starts with the simple claim that 1/3 = 0.33...; as we go forward step by step, we discover the error and find a better and more apparently rigorous logical proof. However, we end up leaving a trail of misunderstanding for others. This means that for superficial observers who do not delve deeply, the repetitive false proofs are enough to confuse them into acceptance, triggering a false positive feedback loop similar to the story of 'The Emperor's New Clothes', hence the name 'The Naked Emperor.'
