A Proof for the Twin Primes Conjecture

  • Published Jan 20, 2025

Comments • 38

  • @twilightstar7781
    @twilightstar7781  3 months ago +2

    Hello! I wanted to make this a relatively simple presentation, otherwise I never would have finished this. I might make an improved version in the future, but for now I hope it suffices :)
    I admit that I am nervous posting this; I worry that after all my work, I have made a mistake somewhere. Nevertheless, I am at a point where I am as confident as I can be that I have gotten it right, and if not, then perhaps someone else can pick up where I left off.

    • @taylorschaerer9854
      @taylorschaerer9854 3 months ago +2

      I am not one who can prove or dismiss what you have here, but when you think you have something you have to share it. There's no shame if this turns out to be wrong. Sharing this was absolutely the right thing to do, so no need to feel nervous about it. I hope smarter people than I can verify this for you. Very exciting and best of luck!

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      @@taylorschaerer9854 Thank you! While I sorta know that deep down, it is always nice to hear from someone else :)

  • @philippwei3352
    @philippwei3352 3 months ago +12

    Hi, I read your paper. The first gap that I spotted (which might be fixable) is that, in the end, just because a sequence (in this case your lower bounds) is increasing doesn't mean it goes to infinity; e.g. 1/2, 3/4, 7/8, ... is increasing but stays bounded by one.
    The more likely non-fixable gap seems to be in your claimed bounds of H_n(x) ± n for the number of valid rows below x. The heuristic H_n(x) only becomes the true number when x is the product of the first n prime hexas, which is a huge number compared to your error term n, so that seemed wrong to me. I didn't understand the "proof", since you never actually explained what exactly your IN function is supposed to count, but I wrote a Python script to try to find a counterexample to the bounds. Here are the first few; coprime_count counts the number of valid rows:
    --------------
    Failed for n = 5 and x = 134 . coprime_count = 30 and H(n) * x = 35.08080155138979
    Test failed for n = 5
    Failed for n = 12 and x = 1809 . coprime_count = 277 and H(n) * x = 289.1043308634691
    Test failed for n = 12
    Failed for n = 14 and x = 2507 . coprime_count = 355 and H(n) * x = 369.12998667052705
    Test failed for n = 14
    Failed for n = 15 and x = 2264 . coprime_count = 307 and H(n) * x = 322.050708996898
    Test failed for n = 15
    --------------
    And below you can find my script for you to check in case I did something wrong.
    Cheers,
    Philipp
    --------------
    def generate_primes_up_to(n):
        sieve = [True] * (n + 1)
        sieve[0] = sieve[1] = False
        for start in range(2, int(n**0.5) + 1):
            if sieve[start]:
                for multiple in range(start * start, n + 1, start):
                    sieve[multiple] = False
        return [num for num in range(n + 1) if sieve[num]]

    primes = generate_primes_up_to(1000)[2:]
    # print(primes)

    def H(n):
        p = 1.0
        for i in range(0, n):
            p *= (primes[i] - 2) / primes[i]
        return p

    def coprime(y, n):
        for i in range(0, n):
            referent = (primes[i] + 1) // 6
            if y % primes[i] == primes[i] - referent or y % primes[i] == referent:
                return False
        return True

    def test(n):
        H_n = H(n)
        coprime_count = 0
        for x in range(1, primes[n - 1] ** 2):
            if coprime(x, n):
                coprime_count += 1
                # print("x =", x, "is coprime with {5, ...,", primes[n - 1], "}")
            if coprime_count < H_n * x - n:  # or coprime_count > H_n * x + n:
                print(
                    "Failed for n = ",
                    n,
                    " and x = ",
                    x,
                    ". coprime_count = ",
                    coprime_count,
                    " and H(n) * x = ",
                    H_n * x,
                )
                return False
        return True

    for i in range(len(primes)):
        if not test(i + 1):
            print("Test failed for n = ", i + 1)
            # break

    • @twilightstar7781
      @twilightstar7781  3 months ago

      For the first problem, that is something to address. I will have to think more on that.
      For the second, the IN() function counts either the instances of one particular hexa, or one pair of corresponding instances over the entire hexorial cycle. These two versions are only used to give the bounds on the approximation. Using it to count instances of hexas gives an error of no more than n, and using it to count instances over the whole cycle ensures that this remains the case after we remove the redundant instances.
      You are right that essentially the only point where H_n(x) is _exactly_ the correct value for valid referents is at the end of the hexorial cycle. But that isn't what we're using; in the paper, I show that at ANY x, the correct number of valid referents can be no more than n away from H_n(x).
      I will be honest, I don't know what you were doing with the Python code, but perhaps an example will help. Let's go with the case where n = 3. Where I use n at this point in the paper, it is the number of prime hexas under consideration, so in this case we are considering the first 3 prime hexas, which are 5, 7, and 11. The critical area is ((11^2) - 1) / 6 = 20. Our approximation is the lesser hexorial over the greater hexorial, times x, so in this case it is
      (3 * 5 * 9) / (5 * 7 * 11) * x ≈ (0.35064935) * x.
      We plug in the critical area and get (0.35064935) * (20) ≈ 7.013. So now we know that the ACTUAL number of valid referents less than 20 is no more than 7.013 + 3 and no less than 7.013 - 3, i.e. between 4.013 and 10.013. On the chart in the paper (the second one, after it has been converted to referents on the vertical axis instead of anchors), you can see there are 7 rows where the first 3 cells are uncolored, i.e. 7 valid referents, those being r = 3, 5, 7, 10, 12, 17, and 18. So indeed, it is well within our predicted range.
      It doesn't matter that this fails to give us the exact value, all that matters is that we have a lower bound, which we can then show to increase. As noted earlier, I need to put in a bit more work to show that this increasing approximation does not approach some value, but that's for another time.
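      A minimal Python sketch of this n = 3 example (illustrative code only, not from the paper; "valid" here means that neither 6r - 1 nor 6r + 1 is divisible by any of the hexas under consideration):
      --------------
      hexas = [5, 7, 11]
      crit_area = (11 ** 2 - 1) // 6   # = 20

      def valid(r):
          return all((6 * r - 1) % p != 0 and (6 * r + 1) % p != 0 for p in hexas)

      valid_referents = [r for r in range(1, crit_area) if valid(r)]
      approx = (3 * 5 * 9) / (5 * 7 * 11) * crit_area       # ~ 7.013
      print(valid_referents)                                # [3, 5, 7, 10, 12, 17, 18]
      print(approx - 3, len(valid_referents), approx + 3)   # 7 lies between 4.013 and 10.013
      --------------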

    • @philippwei3352
      @philippwei3352 3 months ago +1

      @@twilightstar7781 "IN() function counts the instances of ... one particular hexa." I don't get that; can you use IN's parameters in this sentence? What are "instances" here?
      Wrt. the second half: Ok, I thought you are claiming H_n * x - n < coprime_count < H_n * x + n for every x, so I modified the check in my script to test that bound at the end of the critical area:
      --------------
          if (
              coprime_count < H_n * crit_area_end - n
              or coprime_count > H_n * crit_area_end + n
          ):
              print(
                  "Failed for n = ",
                  n,
                  " and x = ",
                  crit_area_end,
                  ". coprime_count = ",
                  coprime_count,
                  " and H(n) * x = ",
                  H_n * x,
              )
              return False
          return True

      for i in range(len(primes)):
          if not test(i + 1):
              print("Test failed for n = ", i + 1)
              # break

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      @@philippwei3352
      - An instance of a hexa is a referent which is invalid WRT that hexa. So for example, the cell (4, 5), that is, row 4 and column 5, is colored in, so the referent 4 has an instance of the hexa 5. 4 is thus an instance of 5, but it is NOT an instance of 7, because it is valid WRT 7.
      For an example with parameters: IN(x, 1, 5) only increases when x, interpreted as a referent, is invalid with respect to 5, i.e. it is flat everywhere except at those values where x is congruent to ±1 mod 5. What this means is that 6x + 1 or 6x - 1 is divisible by 5, and therefore cannot be prime.
      - RE: "I thought you are claiming H_n*x - n

    • @philippwei3352
      @philippwei3352 2 months ago

      @@twilightstar7781 Thanks for your response. Could you please write a clear definition that starts like this: "IN(m, d, x) counts ...", and then actually use m, d and x in that sentence? I'm still unsure how x plays into this, and I think you just switched the order of your parameters in your last answer? This makes it even more time-consuming to check all plausible interpretations.
      " I don't know if it is including 2 and 3."
      It does not; the line "primes = generate_primes_up_to(1000)[2:]" has this [2:] at the end, which means "take the list starting at index 2 (skipping elements 0 and 1), up to the end of the list". So I'm skipping the first two primes, since they are not hexas.
      I figured out a way for you to double-check this without believing my code: Within the critical area, invalid rows are just twin primes, right? Except for the trivial invalids, like 2 being invalid wrt 11, because 6*2-1 is divisible by 11. To my understanding, your IN function also counts those "small rows" as invalid, but if we include them, the number of valid rows is exactly the number of twin primes, and your lower bound is still too large, starting at n=98.
      Failed for n = 98 and x = 48780 . coprime_count = 2941 and H(n) * x = 3044.5270994316384
      Test failed for n = 98
      Failed for n = 99 and x = 49868 . coprime_count = 2986 and H(n) * x = 3101.0529002722756
      Test failed for n = 99
      Failed for n = 100 and x = 51708 . coprime_count = 3085 and H(n) * x = 3203.9280313614927
      Test failed for n = 100
      If you take n=98 primes, the 98th hexa is the 100th prime, which is 541. 541*541-1 is 292680, so x=48780. H(n)*x=3044.5..., that should be checkable with Excel. And for the number of valid rows, just ask ChatGPT how many twin primes there are below 292680. For me it produced this very simple python code (see below), along with the answer 2942. This matches my 2941, since I'm not counting (3,5). In any case, this is less than your bound of 3044.5-98=2946.5.
      --------
      import sympy as sp
      # Find all prime numbers less than 292680
      limit_new = 292680
      primes_new = list(sp.primerange(1, limit_new))
      # Find all twin primes (p, p+2)
      twin_primes_new = [(p, p + 2) for p in primes_new if sp.isprime(p + 2)]
      # Get the count of twin primes
      twin_prime_count_new = len(twin_primes_new)
      print(twin_prime_count_new)
      --------
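      The same comparison can also be reproduced in one go (illustrative snippet, assuming sympy is installed), recomputing both the heuristic bound for n = 98 and the twin-prime count:
      --------
      import sympy as sp

      # The first 98 prime hexas are 5, 7, ..., 541 (541 is the 100th prime).
      hexas = list(sp.primerange(5, 542))
      assert len(hexas) == 98
      H_98 = 1.0
      for p in hexas:
          H_98 *= (p - 2) / p
      x = (541 ** 2 - 1) // 6   # 48780, the end of the critical area
      twins = sum(1 for p in sp.primerange(5, 541 ** 2 - 1) if sp.isprime(p + 2))
      print(H_98 * x - 98, twins)   # claimed lower bound ~2946.5 vs. 2941 actual valid rows
      --------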
      I now also have a guess where your proof went wrong: when you count invalid rows, you are counting invalid instances and subtracting something "for each redundancy". You never quite define what a redundancy is, but you talk about "one pair of corresponding instances" in your first reply to me. I think you are miscounting rows with three or more invalid instances. A row with three invalid instances has 3 pairs of instances, not just the 2 that you need to subtract. You would need to add triples of instances back in, and then subtract quadruples... This is known as the "inclusion-exclusion principle" (see Wikipedia), and I don't see you doing anything like that anywhere.
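      For reference, a short illustrative sketch of that inclusion-exclusion count (standalone code, with "invalid wrt p" meaning that p divides 6r - 1 or 6r + 1), checked against a direct count:
      --------
      from itertools import combinations

      hexas = [5, 7, 11]

      def invalid_wrt(r, p):
          return (6 * r - 1) % p == 0 or (6 * r + 1) % p == 0

      def direct_count(x):
          # referents up to x that are invalid wrt at least one hexa
          return sum(1 for r in range(1, x + 1) if any(invalid_wrt(r, p) for p in hexas))

      def inclusion_exclusion_count(x):
          # alternately add and subtract the counts for every subset of hexas
          total = 0
          for k in range(1, len(hexas) + 1):
              for subset in combinations(hexas, k):
                  c = sum(1 for r in range(1, x + 1) if all(invalid_wrt(r, p) for p in subset))
                  total += (-1) ** (k + 1) * c
          return total

      for x in (20, 100, 385):
          print(x, direct_count(x), inclusion_exclusion_count(x))   # the two counts agree
      --------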

    • @twilightstar7781
      @twilightstar7781  2 months ago

      @@philippwei3352 Sorry for the confusion, I admit I didn't check the order of arguments, and that was careless of me. IN(m, d, x) counts the number of positive integers less than or equal to x which are congruent to d or -d mod m. For example, IN(5, 1, 6) = 3 because there are 3 positive integers less than or equal to 6 which are congruent to either 1 or -1 mod 5, those being 1, 4, and 6.
      - "Within the critical area, invalid rows are just twin primes, right?"
      It is the VALID rows that correspond to twin primes. I suspect that this was a typo based on the rest of that paragraph, but I thought I should clarify just to be sure.
      I must admit it feels suspicious that this allegedly breaks down at the 100th prime, with no discernable cause. That feels like an awfully convenient place for that to happen, honestly. I still suspect it's an error in the code rather than the theory, but I was never very good at diagnosing code even when I was studying it, so I can't be sure. Could you give the results for smaller values of n so that we can check that it is doing these calculations correctly?
      - "You never quite define what a redundancy is"
      I guess I didn't explicitly write it out, but I did say this:
      "This sum counts all instances of the hexas being considered, but there are some referents which are invalid with respect to more than one hexa. In order to count only the invalid referents [as opposed to all invalid instances], we will need to subtract them [the extra instances] out relative to the whole hexorial cycle [...] where k is the number of redundant instances, and lj is the jth such redundancy."
      I figured this made it clear enough what I meant by "redundancy", but I guess not.
      - "I think you are miscounting rows with three or more invalid instances."
      The process you describe in this section is exactly the process I was describing in the paper. The example I used only had two hexas under consideration for simplicity, but in a case such as you describe, you would indeed subtract out multiple copies of the function. I included this under the broad header of "redundancy".
      For example, if we were to look at 5, 7, and 11, x = 64 should be invalid with respect to all three of them, because 6(64) + 1 = (5 * 7 * 11). This essentially means that in the basic approximation with redundancies, there are three copies of IN(385, 64, x) that are added together, one for each of the three hexas. In order to remove them correctly, you would have to subtract two of them so that you're only left with one.
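      For concreteness, a brute-force sketch of IN() as defined above (illustrative code, only meant for checking small cases):
      --------------
      def IN(m, d, x):
          # positive integers <= x that are congruent to d or -d mod m
          return sum(1 for k in range(1, x + 1) if k % m in (d % m, (-d) % m))

      print(IN(5, 1, 6))   # 3, namely 1, 4, and 6
      # 6 * 64 + 1 = 385 = 5 * 7 * 11, so the referent 64 is an instance of 5, 7, and 11:
      print((6 * 64 + 1) % 5, (6 * 64 + 1) % 7, (6 * 64 + 1) % 11)   # 0 0 0
      --------------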

  • @Monkala2
    @Monkala2 3 months ago +17

    Your ideas are fun and creative, but ultimately they do just amount to a restatement of the sieve of Eratosthenes and the heuristic argument.
    Your arguments about the "critical area" are probabilistic and can be boiled down to "since there are infinite primes of the form 6k + 1 and 6k - 1, there must exist infinite k's such that 6k + 1 and 6k - 1 are prime." In fact, that's what's at the heart of what makes this problem so compelling to many mathematicians, that it appears so obvious and yet is so dubious when trying to prove rigorously. Don't take this video down, and keep exploring math! It's a good record of your journey and how far you will come.
    Remember that when proving something, you have to show a condition definitively. It doesn't suffice to just state that "this pattern looks true and it makes sense to me so it must be true"; you need to adhere specifically to the formal principles of mathematical proofs. If there is even a single permutation where, even with a narrowed critical area, the hexas end up being the exact primes / nonprimes ad infinitum that would prevent twin primes from appearing, then the proof is not considered rigorous, and so heuristics aren't sufficient to validate the claim.

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      I don't really follow your reasoning that my argument boils down to "the heuristic argument". It is heuristic, but it is not the argument you seem to be describing. My method does not vaguely gesture at the infinitude at play, but instead gives a specific lower bound on how many twin primes exist in a critical area, and shows that that bound increases as you look at larger critical areas. Nor is it a probabilistic argument; it just turns out that using the probability as an approximation works out pretty well. It's not "I expect that about this proportion of referents in the critical area are valid because that's the probability", it's "I can use the probability as an approximation because I know that the true value is pretty close to that probability".
      If this is a genuine flaw, I would like to understand it better, and I think I would benefit from a more detailed description of how you understand my argument and where it goes wrong.

    • @Monkala2
      @Monkala2 3 months ago +1

      @@twilightstar7781 Well, you're stating patterns that rely on direct observation.
      Can you demonstrate, without bound, that the patterns you see will hold? From what I've noticed, it looks as if all of your methods (your lower bounds, your invalid anchors, etc.) only represent candidates for twin primes, not a guarantee that one will exist, especially as the prime numbers continue to become sparser as magnitude increases.
      For example, I've found a very clear and distinct pattern with implications for a problem I'm working on and have verified its exact calculability up to n = 4*10^17, though I am still working on a rigorous proof as to why this connection exists.
      If you can find a rigorous connection between your observations of these patterns, which look so clear, and undeniable truth, then this could be worth publishing. Otherwise, keep pushing forward with your endeavors! By the way, just out of curiosity, how far have you verified your results?

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      @@Monkala2
      I feel I should reiterate that this video is meant to be an overview; it is deliberately light on the details, as it is meant to give more of a general sense of how it works. The paper gives more detail as to how I justify my claims.
      - "Can you demonstrate that, without bound, that the patterns you see will hold?"
      Yes; I discuss in the paper how these patterns naturally arise from properties of modular arithmetic.
      - "it looks as if all of your methods---your lower bounds, your invalid anchors, etc---only represent candidates for twin primes, not a guarantee that there will exist one"
      This is not the case. If a referent is valid within the appropriate critical area, it IS the host of a pair of twin primes. You have to be careful about the critical area, though. So if you're only looking at 5 and 7, then 3 is valid with respect to both of them. The critical area is at 8, and 3 < 8, so this referent is within the critical area. This means that 6*3 hosts a pair of twin primes (17 and 19); it isn't merely saying "there might be a pair here" which then happens to be true. These facts alone are sufficient to conclude that there is a pair of twin primes there. However, if you picked some x > 8, this is not enough information; you'd need to expand the critical area and the primes under consideration.
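      A brute-force Python check of this claim for the first few critical areas (illustrative code; "valid" means that neither 6r - 1 nor 6r + 1 is divisible by any hexa under consideration):
      --------------
      def is_prime(n):
          if n < 2:
              return False
          return all(n % d for d in range(2, int(n ** 0.5) + 1))

      for hexas in ([5, 7], [5, 7, 11], [5, 7, 11, 13]):
          crit_area = (hexas[-1] ** 2 - 1) // 6   # 8, 20, 28
          for r in range(1, crit_area):
              if all((6 * r - 1) % p != 0 and (6 * r + 1) % p != 0 for p in hexas):
                  # every valid referent below the critical area hosts a pair of twin primes
                  assert is_prime(6 * r - 1) and is_prime(6 * r + 1)
          print("checked all valid referents below", crit_area)
      --------------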

  • @caspermadlener4191
    @caspermadlener4191 2 months ago +1

    Happy to see you are both really humble and quite smart, which made this video a joy to watch!
    Two other people have already made the remarks I had about the mathematics, but I also think your use of "hexa", "referent" and "anchor" is unnecessary, and just makes things a bit harder to read.
    You should have called these "6n±1", "n", and "6n", respectively, and consistently used n as the variable.
    If I give a name to something in mathematics, it's because I can't be bothered to write out something all the time.
    For example, instead of "a number that can be written as the sum of two squares", I used the term "happy number" in an essay I needed to write to finish high school.

    • @twilightstar7781
      @twilightstar7781  2 months ago

      Well, thank you, I am glad I did not come off as arrogant, and I am glad it was enjoyable to watch as well :)
      As far as naming goes, I had the same idea, and I was trying to avoid "numbers of the form 6n ± 1", "the number to which a given hexa is adjacent" and "the number to which 6 is multiplied to get an anchor". I guess it made more sense to me as I was working on this.
      I will keep this in mind, and I encourage anyone who feels similarly to say as much, but I think I would prefer to keep it as-is for now unless others say they had the same issue.

  • @OutbackCatgirl
    @OutbackCatgirl 3 months ago +2

    I've commented on a recent numberphile video in hopes someone there spots it and humours taking a look, and even if it turns out to be an incomplete proof it will be a great opportunity to learn some funky weird fun mathematics quirks!

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      Here's hoping! I believe I emailed them too, so maybe something will come of it.

  • @Skeleman
    @Skeleman 3 months ago +3

    Email Matt Parker!
    He did a video involving the fact that "all primes greater than 3 must be one more or less than a multiple of six", so he might be open to discussing it. He is also friends with many academic mathematicians, so this could be a way of getting more eyes on it.

    • @twilightstar7781
      @twilightstar7781  3 months ago +1

      I have, in fact. I emailed him and a few other channels for which I could find contact info. I hoped to just spread it around as much as I could.

    • @OutbackCatgirl
      @OutbackCatgirl 3 months ago +2

      @@twilightstar7781 Anyone who sees this should consider commenting on Matt's vids, plus those of some other mathematicians if possible, to try to get more eyes on it!

  • @agranero6
    @agranero6 2 months ago +1

    *"all hexas bigger than 3 must be hexas"* is a tautology: you may want to say all hexas greater than 3 must be primes. If so he is pathetically wrong: 385 (6*64+1) is not. *If by hexadjacent you mean 6n+1 OR 6n-1* you are right is fairly simple to prove in a way far easier than you try (and your proof of this not valid for that part).
    Your language is flawed (see above) and you claim lemmas that you don't prove despite of being true.
    Sorry the rest does not worth reading the rest. Life is too short.
    Sorry for being blunt but I am not being 10% as blunt as a mathematician will be.
    Don't despair a lot of proofs Ramanujan sent to Hardy were wrong too.

    • @twilightstar7781
      @twilightstar7781  2 months ago +1

      - "all hexas bigger than 3 must be hexas" is a tautology
      Yes, I have since corrected that mistake. It should (and now does) read "all _primes_ greater than 3 are hexas".
      - If so, he is pathetically wrong: 385 (6*64+1) is not. If by hexadjacent you mean 6n+1 OR 6n-1, you are right, and it is fairly simple to prove in a way far easier than you try (and your proof of this is not valid for that part).
      You'll have to explain this more. I don't really follow what you mean.
      - Your language is flawed (see above) and you claim lemmas that you don't prove despite them being true.
      I prove my statements in the paper. I said at the beginning that this video is an overview, and that more details are provided in the linked paper.
      - Sorry, the rest is not worth reading. Life is too short. Sorry for being blunt, but I am not being 10% as blunt as a mathematician would be.
      This isn't being blunt, this is kinda just being rude. Pointing out a mistake I made (and have already corrected) is one thing. Saying I am "pathetically wrong", assuming you are referring to me in that sentence, goes beyond bluntness.

    • @caspermadlener4191
      @caspermadlener4191 2 months ago +2

      The proof that every prime number greater than 3 is either one more or less than a multiple of 6 is correct, actually.
      Also, mathematicians are really nice people, and not blunt at all.
      The fact that you write a "blunt" comment to somebody who is neither arrogant nor mean is incredibly low. Do better.

    • @Kraflyn
      @Kraflyn 1 month ago +1

      you are not arrogant. Mathematics is about precision. There is simply no other way to communicate in math. I liked the Ramanujan reference :3

    • @twilightstar7781
      @twilightstar7781  1 month ago +1

      @@Kraflyn Well, thank you :) I think that many would understand that I am trying to speak confidently rather than arrogantly, but I nevertheless prefer to clarify these sorts of things, especially since I have had cases in the past where my tone is misconstrued. From what I can tell, that has not happened here, and it's nice to see.

    • @Kraflyn
      @Kraflyn 1 month ago

      @@twilightstar7781 It's about Twin Primes; you were an angel, actually, in your very mathematical and specific critique :D :3 What do they expect? It's Twin Primes ffs... One of the deepest questions in math.