Now prove that as R→∞, your function becomes the cos function.
@@jawadbenbrahim5933 my function has a much smaller infinity than the Taylor series. But an infinity is an infinity, and yes, it would make sense for me to put infinity into the equation and call it complete, but I'm also doing this for computer graphics.
@@SilentALume ngl you still need that x/pi tho
@@SilentALume the function you have created converges to (1/2 (-2 EllipticTheta(2, 0, 16/e^4) - EllipticTheta(4, 0, 2/e) + (2 EllipticTheta(3, -π x, e^(π^2/(-1 + log(2)))) - EllipticTheta(3, -(π x)/2, e^(π^2/(4 (-1 + log(2)))))) sqrt(π/(1 - log(2)))))/(EllipticTheta(4, 0, 2/e)). This function is different from cos(pi*x), since cos(pi*x) = 0 at x = n + 1/2 for n an integer, while your approximation is zero at x = n + 1/2 + epsilon, where epsilon is the error, on the order of 10^(-10). That being said, it seems the two functions have the same peaks and troughs; basically, as R goes to infinity the margin of error goes to 10^(-10).
Welcome back Ramanujan
💀💀💀
lol
I feel like this video went from "oh huh I see where he's going with this" to "what the fuck" in the span of 0.2 seconds.
2:40 went from "what the fuck" to "what the actual hell is wrong with your brain to think this is fun"
This man went from watching 3blue1brown to graphing complex equations with custom colors in 3 seconds
thats what happens when u watch 3b
you become a new man
Wait until he finds out about sin(x+π/2)
☠️
thats cheating ;p
@@ivanb493 he's using imaginary exponents. Those are automatically trig functions. If you think this suggestion would be cheating, then the video is cheating too.
@@TotalTimoTime check again, i is the variable of the sum, not the imaginary unit. I guess it kind of looks like the Taylor expansion if you dissect it, though. But that seems fair.
@@leeroyjenkins0 ok now i see it
I was not expecting to get even that close.
I'll try to make something better
import matplotlib.pyplot as plt
import math

def apsin(x):
    pi = math.pi
    multy = 1
    # in what interval is x in relation to the roots of cos(x)
    # inty = x / 2  # this gives an approximation of how many roots are in front of x
    sign = 1
    if math.ceil(x / pi) % 2 == 0:
        sign = -1
    x = x + pi / 2 - pi * math.ceil(x / pi)
    inty = 1
    return sign * (1 / (9 * pi**4 / 16)) * (x + (inty + 1) * pi - pi / 2) * (x + inty * pi - pi / 2) * (x - inty * pi - pi / 2) * (x - inty * pi + pi / 2)

listy = []
liste = []
print(math.sin(360))
print(apsin(360))
for i in range(1000):
    print('ap')
    listy.append(apsin(i))
    liste.append(math.sin(i))
    print(apsin(i))
    print(math.sin(i))
plt.plot(listy)
plt.plot(liste)
plt.show()
i made a sin function by accident
Sigma
Cosine = e^(ix).real
Sine = e^(ix).imag
You can also convert this to:
i^x.real = cos(πx/2)
i^x.imag = sin(πx/2)
2:43 “this video is gonna take about 2π”
turns out the video length is just about 6:28 haha
Lol. Prolly intentional.
2pi minutes is around 6:17 as a youtube timestamp
@@uggupuggu what?
@@jesp9435 let him cook
@@uggupuggu mmmmh....
2:45 where the trivial stuff begins
thanks 🥰🥰🥰
This video is perfection--really like the 6:28 duration.
after you zoomed out at 0:12 ... i instantly went y=0 will do xD
Bro went from "sooo so close" to entire mathematical documents that already exist
fr
cosinus truly was the equations we made along the way
I think part of the reason why the quadratic in the exponent helped in making the cos approximation is the Jacobi theta function. Basically, the third Jacobi theta function is θ₃(z, q) = Σ[n=-∞,∞](q^(n²) · e^(2niz)). When z = 0, the exponential factor e^(2niz) becomes 1, so we get θ₃(0, q) = Σ[n=-∞,∞](q^(n²)). Splitting this into two sums: θ₃(0, q) = q^(0²) + Σ[n=1,∞](q^(n²)) + Σ[n=-1,-∞](q^(n²)) = 1 + Σ[n=1,∞](q^(n²)) + Σ[n=-1,-∞](q^(n²)). Notice that by symmetry of the sum ((-n)² = n²), Σ[n=1,∞](q^(n²)) = Σ[n=-1,-∞](q^(n²)), therefore θ₃(0, q) = 2·Σ[n=1,∞](q^(n²)) + 1, and so Σ[n=1,∞](q^(n²)) = (θ₃(0, q) - 1)/2. A lot of the sums that appear here can therefore be expressed in terms of the θ function; by substitution and rearranging we can express many if not all of the sum terms that way. The Jacobi θ function is essentially an elliptic analogue of the exponential and exhibits quasi-double periodicity: the periodicity extends in two directions but only roughly follows the periodic pattern, so f(z+u) and f(z+v) may not equal f(z) exactly (with u and v linearly independent), but the trend is still there. Because the imaginary part is removed here, it is only singly quasi-periodic, hence the cos approximation. Sorry if I made any mistakes, make sure to tell me. en.wikipedia.org/wiki/Doubly_periodic_function mathworld.wolfram.com/JacobiThetaFunctions.html en.wikipedia.org/wiki/Quasiperiodicity
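For anyone who wants to check that θ₃ identity numerically, here is a minimal sketch, assuming mpmath is available (the variable names are just for illustration):

from mpmath import jtheta, nsum, inf, mpf

q = mpf('0.3')                                # any |q| < 1 works
lhs = nsum(lambda n: q ** (n * n), [1, inf])  # sum_{n>=1} q^(n^2)
rhs = (jtheta(3, 0, q) - 1) / 2               # (theta_3(0, q) - 1) / 2
print(lhs, rhs)                               # the two values should agree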
I literally just learned in my math class yesterday lol
@@John-cl8iv2 Oh cool what class is that?
@@BaukBauk9491 Wait never mind, I learned the Jacobian in calc 3
That was a nice digestible explanation. Well done
@@BaukBauk9491 you learn about theta functions in complex analysis right?
Close enough, welcome back Srinivasa Ramanujan
Man was I a fool to think that when I clicked on this video it was gonna be about anything I understand.
You are implicitly using e^ix, which itself encodes the desired results from Euler's formula.
I thought so too, but 'i' is not being used as the imaginary unit. It comes from the summation.
4:36 bro the music is pi!!! that's how i memorize it so i recognized immediately, this is awesome!
sin(x+π/2) is a decent approximation if you ask me
nah no one asks
I don't want to use sine
sin(x-π/2) = -cos(x), not cos(x)
@@plenus2017 shut up, thank you
Absolute cinema
Wait until bro discovers taylor series 💀💀💀
hahah literally what i thought
Dumbass he literally said it in the beginning
True, but just think about how the numbers involved here don't blow up. This can produce an amazing approximation with as few as 6 terms, extended to the whole domain with wrapping.
@@Tabu11211 6 terms only makes this approximation valid for around |x|
@@gamerboy7224 please do, because I might be missing something. What I did to extend it was this: (x - 2π·floor(x/(2π)))/π. Replace x with that and it will use the single cycle for the whole domain.
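In Python that wrapping step could look something like this; the helper name is made up, and it assumes (as in the thread) that the single-cycle approximation takes its argument in units of x/π:

import math

def wrap(x):
    # reduce x to one 2*pi period, then divide by pi as in the formula above
    return (x - 2 * math.pi * math.floor(x / (2 * math.pi))) / math.pi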
unbelievable work.
I'm too dumb to understand the process, but it looks like you put a lot of hard work into this one 🔥🔥🔥
close enough, welcome back Ramanujan
YOU FINALLY GOT A VIDEO THAT WENT SEMI VIRAL YESSSS
Obviously, very trivial stuff really
his cos function flies away at x=-195 and x=195
Nintendo (1996) hire this man
you can actually perfectly recreate cos(x) 1 to 1 by taking the real part of i^(2x/pi), which would look like real(i^(2x/pi)), but you have to make sure you toggle on complex mode in the settings first
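A quick Python check of that identity; Python's complex power uses the principal branch of the logarithm, so Re(i^(2x/π)) = Re(e^(ix)) = cos(x) for any real x:

import math

x = 1.234
print((1j ** (2 * x / math.pi)).real)  # real part of i^(2x/pi)
print(math.cos(x))                     # matches cos(x)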
your last term can just be simplified as "R", you don't need the summation of 1 from 1 to R
Great work here! To me, it seems that you've derived a quirky Fourier-Poisson approximation with a mainly hyperbolic cosine approximation. I think one of the more concrete places to start would be the complex exponential definition of trigonometry, and approximate that, instead of doing visual approximation. Overall, great job though!
I was thinking this might have applications for, like, working out cos quickly without a calc, until I saw the final equation XD
it feels like a Taylor expansion anyways, but in the most roundabout way possible
Instead of adding parabolas in the beginning, you could multiply them
y = (1-(x/(0.5π))²)*(1-(x/(1.5π))²)*...
and that will be exact as you add infinite factors...
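As a minimal sketch, the truncated product looks like this in Python; it does converge to cos(x), but slowly, so many factors are needed for good accuracy:

import math

def cos_product(x, factors=200):
    # truncated product: cos(x) = prod_{n>=1} (1 - (x / ((n - 0.5) * pi))**2)
    result = 1.0
    for n in range(1, factors + 1):
        result *= 1 - (x / ((n - 0.5) * math.pi)) ** 2
    return result

print(cos_product(1.0), math.cos(1.0))  # close, but the tail converges slowly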
I have no clue how you did anything, but this is the type of smart I aspire to be
In desmos, it's really easy to get an approximation of the cosine function: cos(x)
Bro just bruteforce the taylor series
no. the taylor series is much less efficient than this
this is the math analogue of "doing a little mining off camera"
Taylor series: Really bro?
The design of the cosmos (simplified)
Rung the bell. Love this exploratory chaos.
bro forgot that adding parabolas gives you back a parabola
the (D^ix + D^-ix)/2 type stuff you have going on in there is literally just the definition of cos for D=e. It's not exactly what you have, but something like that is going on there
The co-sine function
"tHe cOS fUnCtiOn" it sent vibrations down my spine
I don't know much math and have no idea what you were doing but this is really entertaining
There is a channel that made a really fast sin function for the N64. The channel is called Kaze Emanuar. His isn't as accurate, but I think he used what he calls a folded polynomial to get different parts of it.
plugging this instead of trig functions, to avoid trig in pre calc
@@thomasbeaumont3668 lol
could've put down a sine function and shifted the phase and that would've been a perfect "approximation" lol
Very nice!!
sin(x+pi/2)
Me when I can't figure out how to simplify my answer in an exam
So that went from 0 to 100 real fast
Divide the input by 2pi and take the remainder. Then you only need to approximate that little range from 0 to 2pi, and every other input will work too. It's what computers actually do when they calculate these functions. It's called modular reduction, or range reduction, and it's used everywhere.
You can actually do more than 2pi because between 0 and pi is symmetrical to the part between pi and 2pi. And between 0 and pi/2 is also symmetrical to the part between pi/2 and pi. That leaves you with a tiny little piece of curve. And if you can approximate it, you get the rest of the values everywhere else.
I tried a quadratic and got -0.35*x^2 -0.1*x -1. Looks pretty close by eye. I'm sure it's possible to do infinitely better of course. Computers can break it up into many segments, with lookup tables for the best fitting polynomial for each segment. You can do even better than that, but lookup tables and polynomials are very fast to compute.
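A rough Python sketch of that range-reduction idea; the quadratic here is just my own interpolation of cos through 0, π/4 and π/2, not the fit quoted above:

import math

def cos_via_reduction(x):
    x = math.fmod(abs(x), 2 * math.pi)  # cos is even and 2*pi periodic
    sign = 1.0
    if x > math.pi:
        x = 2 * math.pi - x             # cos(x) = cos(2*pi - x)
    if x > math.pi / 2:
        x = math.pi - x                 # cos(x) = -cos(pi - x)
        sign = -1.0
    return sign * (1 - 0.1092 * x - 0.3358 * x * x)

print(cos_via_reduction(5.0), math.cos(5.0))  # rough, but in the right ballpark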
4:55 You never needed pi to go over the circle though...
You might have just accidentally created an actual expansion of cos, especially with using e in your constant (cos(x) = (exp(ix)+exp(-ix))/2 after all).
Also, the second sum of 1 from 1 to R is just equal to R, the sum is unnecessary.
It's because I wanted it to be a whole number
Then the floor function might be your go-to ig
@@SilentALume you can make the slider go up by 1s
@@SilentALume use the "step" option when you get into the slider range editor and set it to 1, is what the above meant.
Video so good approximation is a key concept twice
2:44 this is when shit gets serious
u should do some exploration on y=sqrt(24x+1) there are beautiful patterns in the primes of the function.
idk if im high or something but the formula in the thumbnail looks like a 50 bmg sniper rifle
idk man i think using cos would've been easier
another video with some more explanation of the process/your thinking would be awesome!
"if R goes to infinity what is D?"- SilentALume
bros video actually blew up
well done!
not trying to be rude, but you can click the home button under the zoom out button to go back to default center zoom, y'know? not sure if you knew but seeing you struggle to zoom back in (specifically between 0:18 and 0:30) was kinda painful lol
Truly remarkable 👏🏻
Bro spent so much time approximating the cosine function, while I easily got an ideal approximation with sin(x + π/2)
that wasn’t obvious for me
Didn't expect to see you here
As people have probably already said, the cos function is very closely related to the exponential function. And you probably made the vanishing part of the Gaussian go to zero with the limits.
now integrate it
Perfectly normal Asian behaviour.
Subscribed just because of this xD
bro was DESPERATE
taylor expansion enters the chat
Wait until he discovers Taylor series
wait until this guy finds out about taylor series
take the derivative of it and see if it's an approximation of -sin
anything but the taylor series
Thats impressive and also genius
instead of writing the sum of 1 for "I" that goes from 1 to R (in the denominator of the last term), you could've just written R
good luck. I couldn't even figure out the sin function
this is insane
bro has the weirdest time clock
Fun desmos tip is you can make functions like
T(x, y, u) = x * y * u
Or without the static-like function
x1 = 2
y1 = 5
u1 = 4
T = x1 * y1 * u1
'use the long method'
I love how the video was 2pi long (2:44)
Ok this is cool, but by using the exponential function to approximate a trigonometric function (cosine), aren't you effectively approximating a trigonometric function with a trigonometric function? After all, cos(x) = (1/2)(e^(ix) + e^(-ix)), and your equation looks oddly similar, in the sense that it has the form of (a sum of) (e^f(x) - e^g(x))/c plus some error, where the error decreases as the number of terms increases. I don't feel like attempting to prove it, but this looks like some sort of Taylor expansion of Euler's formula.
I might be wrong of course.
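For reference, a one-line Python sanity check of that identity (nothing specific to the video's formula):

import cmath
import math

x = 0.7
print(((cmath.exp(1j * x) + cmath.exp(-1j * x)) / 2).real, math.cos(x))  # these agree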
1:52
Bell-shaped function
en.wikipedia.org/wiki/Bell-shaped_function
tell me someone who doesn't love the sigma function
also, another interesting way to do this kind of thing is with the bell curve. I may be wrong, I'm an engineering student
This is what i do for fun
Just use Sin
all I need to approximate the sin function is to pull up to the crib w some baddies. The whole function be sinning.
Maclaurin and Taylor: 💀
you can just use a taylor series
from math import pi

def cos_approximations(x):
    R = 10
    total = 0
    e = 2.71828182846
    D = e / 2
    for n in range(1, R + 1):
        d1 = -D ** (-(x / pi + 1 - 2 * n) ** 2)
        d2 = D ** (-(-x / pi + 1 - 2 * n) ** 2)
        d3 = 2 * D ** (-(2 * n - 1) ** 2)
        denom = 0
        for i in range(1, R + 1):
            NUM = D ** (8 * i - 4) + 1 - 2 * D ** (4 * i - 1)
            DENOM = 2 * D ** (4 * i * i)
            denom += NUM / DENOM
        total += (d1 - d2 + d3) / denom + 1 / R
    return total

print(cos_approximations(1))  # --> 0.5403023059153995
translating bro's formula into python code 💀
also, this is what the standard cos and sin Taylor series look like in python code:
from math import pi, factorial

def cos(x):
    x = x % (2 * pi)
    total = 0
    for n in range(10):
        total += ((-1) ** n * x ** (2 * n)) / factorial(2 * n)
    return total

def sin(x):
    x = x % (2 * pi)
    total = 0
    for n in range(10):
        total += ((-1) ** n * x ** (2 * n + 1)) / factorial(2 * n + 1)
    return total
It seemed quite obvious that what you were actually doing was generating an alternative (and much more complicated) way to express the Taylor series?
I clicked on this video expecting to see a fast approximation, useful for things like games/computer graphics, and in the beginning, this seemed like what you were doing, but then wham! 🙂
hey! great vid, just want to ask what is the program you are using to annotate/draw and make text boxes during the timelapse?
Thank you.
2:48 Yoo, is that the fibonacci music?
Going around a circle using e
2:44 the video is in fact 2 pi
2:50 this is why i quit desmos, too much sweats
Here's a fun question: How can you prove your function is similar to cos(x) without desmos? 🙂
before 2:43 i had an idea as to what was actually going on but boy was i wrong