• Leate_Wonceslace@lemmy.dbzer0.com
    · 2 days ago

    And my argument is that 3 ≠ 0.333…

    After reading this, I have decided that I am no longer going to provide a formal proof for my other point, because the odds are you wouldn’t understand it, and I’m now reasonably confident that anyone who could follow it already understands the fact the proof would’ve supported.

    • Tlaloc_Temporal@lemmy.ca
      · 2 days ago

      Ah, typo. 1/3 ≠ 0.333…

      It is my opinion that repeating decimals cannot properly represent the values we use them for, and I would rather avoid them entirely (kinda like the meme).

      Besides, I have never disagreed with the math, just with how poorly we go about correcting people. I have used some basic mathematical arguments to try to intimate how basic arithmetic is a limited system, but this has always been about solving the systemic problem of people getting caught out by 0.999… = 1. Math proofs won’t add to this conversation, and I think they are part of the issue.

      Is it possible to have a conversation about math without either fully agreeing or calling the other person stupid? Must every argument, even on this topic, be backed up with a proof (a sociological one, in this case)? Or did you just want to feel superior?

      • Leate_Wonceslace@lemmy.dbzer0.com
        · 2 days ago

        It is my opinion that repeating decimals cannot

        Your opinion is incorrect as a matter of definition.

        I have never disagreed with the math

        You had in the previous paragraph.

        Is it possible to have a conversation about math without either fully agreeing or calling the other person stupid?

        Yes, however the problem is that you are speaking on matters of which you are clearly ignorant. This isn’t a question of different axioms, where we can show clearly how two models are incompatible but resolve that both are correct in their own contexts; this is a case where you are entirely, irredeemably wrong, and are simply refusing to correct yourself. I am an algebraist; understanding how two systems differ and compare is my specialty. We know that infinite decimals are capable of representing real numbers because we do so all the time. There. You’re wrong and I’ve shown it via proof by demonstration. QED.
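
        And if you want more than demonstration, the standard derivation is a one-line geometric series. A minimal sketch of the usual textbook argument (nothing here beyond ordinary real analysis):

        ```latex
        % 0.999... is, by definition, the limit of its partial sums:
        % a geometric series with first term 9/10 and common ratio 1/10.
        0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
                    \;=\; \frac{9/10}{1 - 1/10}
                    \;=\; 1
        ```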

        They are just symbols we use to represent abstract concepts; in the same way that I can inscribe a “1” to represent 3 − 2 = { {} }, I can inscribe “.9~” to do the same. The fact that our convention is occasionally confusing is irrelevant to the question; we could have a system whereby each number gets its own unique glyph each time it’s used, and it would still be a valid way to communicate the ideas. The level of weirdness you can have and still get a valid notational convention goes far beyond the meager oddities you’ve been hung up on here. Don’t believe me? Look up lambda calculus.
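
        For a concrete taste (a toy sketch in Python rather than raw lambda-calculus notation, and the helper names are mine, not standard), here are Church numerals: numerals written as nothing but functions, with no digit glyphs anywhere, that still support honest arithmetic:

        ```python
        # Church numerals: a number n is encoded as "apply f, n times".
        # No digits appear anywhere, yet it is a perfectly valid notation.

        zero = lambda f: lambda x: x                      # apply f zero times
        succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application

        add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m + n applications

        one = succ(zero)
        two = succ(one)
        three = add(one)(two)

        # Decode back to an ordinary int by counting the applications.
        to_int = lambda n: n(lambda k: k + 1)(0)

        assert to_int(three) == 3  # the glyph "3" and this function name the same number
        print(to_int(three))       # prints: 3
        ```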