My unpopular opinion: His videos are the best on computerphile.
He's very clear, and the material is both deep and interesting.
he's indescribably funny
I agree, he's def. one of my favorites. I don't think it's such an unpopular opinion!
Actually quite easy to understand, but he uses an awful lot of Anglicisms.
KEKW
🤣
lol 😆
Best comment.
Exactly, he should just speak German instead.
Finally a video on Curry-Howard equivalence! A nice, simple way to introduce it to boot. This was a bit of a mind-blower when I first stumbled upon it and really understood what it meant (being reasonably competent in both mathematics and programming at that point). One minor nitpick, you could've probably named the thing to make it easier to google for.
Such an elegant, potent, and weighty notion and a cornerstone of computer science butchered in this video by a babbling goon.
Relax, dear. You seem to be awfully worked up about this. Drink some chamomile tea, reflect on the shortness of life, and ask whether you need to waste what little time you've got on petty attacks over a video on the Internet.
Hahah nice answer. Thanks for mentioning the scientific term dude.
About Vladimir Voevodsky's work, there's an excellent article from Quanta Magazine from May 2015. It's about replacing the foundations of maths (set theory) with univalent foundations (type theory), and how that then helps to automate maths proofs.
Wow, it's so weird seeing this video after discovering these concepts on my own a few months ago for a project that I have been working on. What I discovered is that predicate logic, algebra and type theory all have a very similar structure:
not ⊂ incrementation ⊂ option type (nullable value)
xor ⊂ addition ⊂ sum type (enum + union)
and ⊂ multiplication ⊂ product type (structs/tuples/cons cell)
implication ⊂ exponentiation ⊂ function type (functions/arrays/dictionaries...)
false ⊃ 0 ⊃ bottom type ("exception")
true ⊃ 1 ⊃ unit type ("void")
true ⊃ ...
true ⊃ ∞/variable ⊃ top type (untyped variable)
From this I also discovered:
- Predicate logic is a restriction of algebra, where every operation is performed modulo 2.
- Algebra is a restriction of type theory, where every operation is performed on the number of possible states each type describes.
- Type theory doesn't need inverse functions like in algebra (decrement, subtraction, division, root, logarithm) because it is completely constructive.
I've tried imagining what hyper-operations after exponentiation would mean, but I couldn't think of anything useful other than that it may be related to mapping lower dimension types to higher dimension types (point => line => circle => sphere => ... n-sphere).
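A quick Haskell sketch of the counting idea above (the enumerations are mine, just for illustration): the sum, product and function types built from Bool all have 2+2 = 2×2 = 2² = 4 inhabitants.
sums :: [Either Bool Bool]   -- 2 + 2 values
sums = map Left [False, True] ++ map Right [False, True]
products :: [(Bool, Bool)]   -- 2 × 2 values
products = [(x, y) | x <- [False, True], y <- [False, True]]
functions :: [Bool -> Bool]  -- 2^2 values
functions = [const False, id, not, const True]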
awesome insight :)
void is the bottom type, not exception. Generally, any type that has no elements is an empty type (bottom). The unit type has one element, in Haskell it's ().
@TheFrozenCompass Void is by definition not a unit type. A unit type is a singleton, whereas Void has no elements; it's an uninhabited set. Haskell has both Void and unit.
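A minimal illustration of the difference, using Haskell's real Data.Void module:
import Data.Void (Void, absurd)
unit :: ()              -- the unit type: exactly one value
unit = ()
fromVoid :: Void -> a   -- Void has no values, so this function is vacuously total
fromVoid = absurd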
Very well explained! Here is a simple Haskell implementation.
{-# LANGUAGE TypeOperators #-}
type P = Bool
type Q = Bool
type R = Bool
type a × b = (a, b)
type a + b = Either a b
f :: P × (Q + R) -> (P × Q) + (P × R)
f (x, Left y) = Left (x, y)
f (x, Right z) = Right (x, z)
Unneeded parens, use precedence ;)
I don't know (any) Haskell. Yet somehow reading your code made more sense than the entire video.
Thanks for this.
Do you also have a Haskell explanation for the “Left of y” and “Right of z” parameters; do they just mean of type y : Q and z : R? If they are all Boolean, why have lefts and rights?
The Either a b type constructor takes two types and returns a type whose values can be either an a or a b. The values are Left (something of type a) or Right (something of type b).
For example, say we had a Haskell function nthPrime :: Int -> Int which takes an integer n and returns the nth prime. If we decide that, when we get a negative number for n (and thus the nth prime is nonsensical), we want to return an error message instead, how would we do that if we said the function returns an Int?
Well, one way is to use Either. Define nthPrime :: Int -> Either Int String. Then if they give us valid input, e.g. n=5 we just return Left 11, to specify that the input was valid and we returned an Int. If the input is invalid we can return Right "Hey, only positive numbers are allowed!".
They don't even have to be different types. You can have an Either String String to represent a valid response vs an error message, which I think pertains more to your question.
Either Bool Bool would let us differentiate between P and Q.
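Here is a runnable sketch of that example (the naive primality test is mine, just for illustration; it also keeps Left for success and Right for errors as described above, although the usual Haskell convention is the other way around):
nthPrime :: Int -> Either Int String
nthPrime n
  | n < 1     = Right "Hey, only positive numbers are allowed!"
  | otherwise = Left (primes !! (n - 1))
  where
    primes    = filter isPrime [2 ..]
    isPrime k = all (\d -> k `mod` d /= 0) [2 .. k - 1]
So nthPrime 5 is Left 11, while nthPrime (-3) is the Right error message.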
There is no need to prove the statement just for Bool, you can leave it fully parametric:
{-# LANGUAGE TypeOperators, NoStarIsType #-} -- NoStarIsType lets * be used as a type operator
type a * b = (a,b)
type a + b = Either a b
f :: p * (q+r) -> (p*q) + (p*r)
f (x, Left y) = Left (x, y)
f (x, Right z) = Right (x, z)
I love a scene interrupted by a cute puppy. Totally unexpected in propositions as types!!
8:10
This is a hugely important idea in programming. Great introduction to the idea here! Thanks!
It does seem like an important idea. Shame this professor can't teach it to save his life......
I agree, shame he doesn't speak comprehensible English.
Most often, we write our programs as value-level logic. We gain type-checking if we write them as type-level logic. Type-checking gives confidence and maintainability to programs.
Some good stuff on computerphile lately. You should probably link to the paper, though.
Also, if anyone wants to actually play around with this: Take a look at Agda or Coq. Those are languages specifically made to do this.
Some other functional languages like Haskell and Idris can also do this, but especially in Haskell, it's a bit of effort.
Well, there's still hope we'll eventually get dependent types in Haskell, and then it won't be as much effort.
Indeed. The day we get DependentHaskell will be a glorious one!
I can't wait.
Also Idris! (It has similar syntax to Haskell)
I worked with a language in university, Coq, that allows one to write provably correct programs. The syntax being shown is very similar; I wouldn't be surprised if Professor Altenkirch were one of the individuals working on it.
Coq is developed mainly by academia in France, as far as I know
I didn't fully understand, but this area seems very interesting to me. I am a CS student generally interested in areas relating to mathematical logic, so this seems like a perfect fit for me. I will be reading more into this.
I don't see the problem with his voice; it's not hard to understand what he's saying.
At first I thought he was saying "two" and "four"... only later did it come to me that it was actually "true" and "false".
Well then you must be possessing some superhuman understanding skills. Like an X-Man or stronger.
*It'z not hahd zu undarständ
@@rostislavsvoboda7013 It probably has something to do with where you come from. As a Portuguese speaker I have no trouble understanding him
Best video from this guy on computerphile so far
This is just fascinating. Top content right here on Computerphile. Thanks for this.
I'm just digging that classic LaserJet with the IrDA interface.
Great video, thanks !
Now that you've watched this, go see Prof. Wadler's longer talk with the same title from the Strange Loop conference.
I used to find his episodes really exhausting, trying hard to understand them when I had had just one class of C++ as a mechanical engineer. Now, maybe a semester into programming via various C++ conference talks and other content (particularly the talks of Adobe's senior principal scientist Sean Parent, which have been eye-opening), and having been introduced to a couple of different languages, I've started to understand the more fundamental problems in programming that every language tries to solve in a different way, and these are really fascinating topics. How do you show that your function is reliable under concurrency and parallelism, or how can you become confident that it doesn't have a critical bug, perhaps in a setting where you can't really test the function/program? What kinds of structures provide confidence?
What I got out of it: if something is not provable, it is not classically computable; therefore, use a mathematical logic that represents the idea of computability for computer science applications?
One thing I have noticed from the comments in this video, is that everyone is from a different background.
Some are programmers, some are mathematicians, some are logicians and some have no background in the things you covered.
Somehow this feels excellently taught but I understood so little of it.
I feel the same. Maybe we need to study more :)
I LOVE PROFESSOR ALTENKIRCH SO SO MUCH
Ah, I love symbolic logic! That is what I specialized in back when I was a philosophy student.
In case no one else noticed, he essentially constructed the category whose objects are propositions and whose morphisms are implications between propositions. Indeed, conjunctions (ands) are products in this category, and disjunctions (ors) are coproducts, which he calls '+' here; it's basically a disjoint union.
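For anyone who wants the universal properties spelled out, here is a sketch in Haskell (the Prelude already ships the coproduct's copairing as either):
pairing :: (c -> a) -> (c -> b) -> (c -> (a, b))        -- product: pair up two maps in
pairing f g c = (f c, g c)
copairing :: (a -> c) -> (b -> c) -> (Either a b -> c)  -- coproduct: case-split two maps out
copairing f _ (Left  a) = f a
copairing _ g (Right b) = g b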
This was like playing NES Zelda, walking up to the guy who says, "I am Error," and saying to him, "I'm getting the same message." What did I watch?!
At 0:17, I was almost expecting him to say, "in kurzgesagt, we could say..."
At 9:00, I believe he's referring to how some people call the `left` and `right` functions `inl` and `inr`.
This question means the world to me right now, and if it could be sufficiently answered or engaged with, I would be very grateful: does this correspondence imply at all that complex types and their algebra can be represented fully with Boolean values and their operators?
Ja?
Space junk Ja Ja Binks
Awesome video. Thanks to all the people who make this type of content possible :)
So basically it is better to use propositions as types than propositions as bools, because if you use propositions as bools, it will check all the possibilities for the statement, but if you use types it will give the proof and then decide whether it's true or false. It's faster and more general. Right?
Well, propositions as bools is more powerful, as was demonstrated. However, with propositions as types you can embed this into your programming language and have it machine-checked. You could have the compiler prove that your sorting algorithm does indeed sort the input without having to rely on tests which may or may not find a bug.
Another great thing is that it avoids "boolean blindness". Your proof does not just return "true" or "false" but rather "the statement X is true/false". It contains a reference to what was actually proven. (Also known as evidence)
This allows you to actually use the proof in other proofs, which is rather useful. (Though, with bools, you can't do any of this anyway, so it might be a bit unfair to put this on top)
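A tiny Haskell illustration of "boolean blindness" (the helper names are mine): the second version hands back the evidence instead of a bare verdict.
import Data.List (find)
containsKey :: Eq k => k -> [(k, v)] -> Bool   -- blind: says a key exists, forgets its value
containsKey k = any ((== k) . fst)
lookupKey :: Eq k => k -> [(k, v)] -> Maybe v  -- evidence-carrying: Just v is the witness
lookupKey k = fmap snd . find ((== k) . fst)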
Thank you. I appreciate it.
+Yndostrui I had completely missed the point of the whole video before reading your comment. Thank you so much.
+Yndostrui There exist formal systems in classical logic that allow you to use proofs in other proofs, as well as ones that can be machine-checked.
In these aspects, homotopy type theory doesn't do anything new; it has just done it in a nice way.
Awesome! I hope you do more videos like this.
Tom Petty - Rock Star Computer Scientist
At 13:11. Symbol "x" appears with three entirely different meanings: as an operator, as the expression that the predicate/function operates on, and as the first of two arguments to the function. (However, "?" is clearly defined.)
I was able to follow the proof, but it would have been easier if different symbols had been used.
Yeah, I had to spend a while rereading that bit; thanks to you, now I know I did understand him
in Haskell:
f :: (p, Either q r) -> Either (p, q) (p, r)
f (x, Left y) = Left (x, y)
f (x, Right z) = Right (x, z)
does “pigs can fly” refer to Void?
10:50 To prove the tautology he also needs to define the inverse function.
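In the Haskell style of the snippets above, that inverse direction would be (a sketch):
g :: Either (p, q) (p, r) -> (p, Either q r)
g (Left  (x, y)) = (x, Left  y)
g (Right (x, z)) = (x, Right z)
Together with the f given earlier, this witnesses the equivalence in both directions, not just one implication.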
Me, a CS student: "Oh haha yeah, I know what a program is. lol"
Me, a CS student after this video: "What actually is a program?"
Fellow viewer: Do you understand what he's trying to say without previously knowing it?
Felds Liscia
Nope!
It just clicked straight away.
Yes… but I did pause to read the Wikipedia article on Zermelo-Fraenkel set theory. And being familiar with the idea of foundations of mathematics (Peano Arithmetic, Russell's goal of set-theory/logic-based mathematics), propositional logic, predicate logic and the theory of computation probably helped me a lot in understanding it. I can imagine this being kinda tricky if you do not know about these things.
All I saw was the commutative property over and over again. I wasn't sure if he was saying any more than that.
One day, I think Cubical Type Theory (the constructive version of Homotopy Type Theory) will be sufficiently developed to be able to do physics in it; that is, one day they may be able to re-formulate the big theories like quantum mechanics and relativity in cubical type theory and thus produce the so-called "theory of everything" - that is how big type theory is. For now it is a way to make better / more secure APIs and to catch bugs at compile time. But in the future type theory will be to our civilization what geometry was to the Greeks, nothing less than our greatest achievement. Mark my words.
What is this topic of mathematics called and where can I read more about it? In particular, I’m curious how it connects with the type systems of functional languages like OCaml.
type theory
I'm confused at minute 15:03 where he's explaining why the Law of Excluded Middle does not hold in this isomorphism. He says we must decide whether all predicates are true or all predicates are false, but aren't we showing that every predicate is either true or false? As in the prime number example given, where for some natural numbers it's going to be true, while for others it will be false. Does anyone have an explanation?
This should be called: Program Algebra (it already exists, look it up)
Uh... it should be called the Curry-Howard isomorphism, and it's been a thing since at least the 1970s.
Can programs handle mathematical induction? If so, what gets in the way of prop = bool?
If you assume prop=bool, that means each proposition is either True or False. When we get into programs-as-proofs, we run into one tiny problem. There's a class of propositions that are, within the system, neither True nor False, but unprovable. Reason being, the system only allows for constructive proofs, and not all possible proofs are constructive (also Gödel's theorem, there can be theorems that are, given a set of axioms, unprovable in principle, so classical logic doesn't make all that much sense anyway)
Yes, mathematical induction corresponds to primitive recursion, i.e. to write a function from the natural numbers it is enough to say what the function does for 0 and what it returns for n+1 if you assume that you already know what it returns for n. Prop = bool doesn't work because, as I have said, we cannot interpret the quantifiers as programs on the booleans.
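A sketch of that correspondence in Haskell (the definitions are mine, for illustration): the recursor for the natural numbers is exactly the induction principle, asking for a 0 case and a step from n to n+1.
data Nat = Zero | Suc Nat
rec :: a -> (Nat -> a -> a) -> Nat -> a   -- the induction principle as a program
rec z _ Zero    = z
rec z s (Suc n) = s n (rec z s n)
add :: Nat -> Nat -> Nat                  -- e.g. addition by recursion on m
add m n = rec n (\_ r -> Suc r) m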
Thorsten Altenkirch Thank you for the answer.
Am I the only one who wanted him to use discriminated unions instead of left and right operators? You can translate from one to the other, of course.
That's actually pretty much how discriminated unions work in Haskell, on the left of = you need to pull it out of the union, and on the right you need to construct one.
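For example (a sketch): on the left of = a constructor pattern takes the union apart, and on the right a constructor builds one.
data Direction = North | South
flip' :: Direction -> Direction
flip' North = South   -- left of =: match a constructor; right of =: apply one
flip' South = North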
I know I'm a few years late, but can someone please caption this video? The concept is super cool, but it's pretty difficult to understand his accent when distinguishing important words like "two" and "true".
The explanations use a lot of paper for a computer-phile channel.
It's an interesting topic, but I honestly think the explanation could use some work. Some more examples showing the correspondence between proofs and programs would be nice (where all the steps are shown). I also don't think everyone is familiar with functional programming types and proof trees (so a lot of in-between steps or transformations seem to be missing).
Well, these videos aren't exactly lectures. They are more ideas for you to take an interest in.
@@Rockyzach88 This video seems to skip past a lot of the introductions compared to other videos on this channel, making it harder to take an interest.
No comment thread on your article glorifying mathematics? The way mathematics is taught and the way programming is learned are vastly different. In mathematics, there is one solution and a thousand ways not to get it. In programming, save for some cases, there are many ways to devise a solution and only a few ways not to get it. You can be a bad mathematician but a great programmer. The converse is also true. You don't need to know a single mathematical theory, or a computational one, to solve many computational problems. You only need to hone problem-solving and design skills, though practicing mathematics likely helps. Lastly, why reinvent the wheel when copypasta exists?
love the new foo fighters track, anyone else? thoughts?
Yo this is sick!!!
Well that takes me back to college...
Why would you need to check the input to a program when you want to always return true? Just return true.
So what you are saying is that in this propositional logic, the law of excluded middle (A or not A) is unprovable? Makes me wonder, is this a case of Gödel's incompleteness theorem, where the statement is true but unprovable? Or maybe, more worldview-shattering, it actually demonstrates that the law of excluded middle is not always true, and Boolean logic merely happens to be an area where it is...
Well all roads lead to the Axiom of Choice (via Diaconescu's theorem)
There are a few logic systems where the law of excluded middle is unprovable in the general case. Notably, constructive logic (also known as intuitionistic logic) allows proving excluded middle on a case-by-case basis, but it's impossible to prove in the general case. Some logic systems replace the law of excluded middle with negation as failure, meaning that if something is impossible to prove true, it must be false, which is a bit of a way out.
I am saying that if we base logic on evidence instead of truth then the principle of excluded middle is not valid. Because in this case we would have to give either supporting or negative evidence for any proposition. And if we use propositions as types then we are not even allowed to look into a type hence it is clearly impossible.
Yes, the axiom of choice is also not justified by the propositions-as-types explanation (if you do it right). However, Diaconescu proved that the axiom of choice implies excluded middle. The opposite does not hold; hence the axiom of choice is actually a stronger non-constructive principle.
+Thorsten Altenkirch Thanks for taking the time to write us informative replies, looking forward to your next video!
Curry-Howard-Isomorphism at work :)
Today I have a midterm on my propositional logic >.
So, what about call-with-current-continuation then?
I wondered what Rick Wakeman has been up to lately, but what's with the accent?
Randgalf wasn’t expecting to read that this far deep in the comments...
is this one of the twins from Matrix?
He is Brad Pitt's cousin.
I think I just blew up a neuron
It's fine, you have 99,999,999,999 left
I never understand any of this guy's videos.
It would help if he opened his mouth when he spoke!
this is still better than GoT
Having experience with functional programming is very beneficial to understanding this.
You can just feel the category theory pulsing under the surface.
It seems to be aimed at an audience that has a small background in mathematical logic already.
Isn't P ∨ ¬P equal to P ← P, and doesn't it thus translate to P → P, whose proof would be the identity function (λx. x)?
No, in constructive logic ¬¬P is not equivalent to P
But isn't the proposition to be proved (P ∨ ¬P) just plain propositional logic? Or have I missed something?
Constructive logic. (P ∨ ¬P) isn't an axiom; you need to prove it on a case-by-case basis, and it's not provable in the general case. Propositions can be "unprovable" (that is, neither "true" nor "false").
Another name for constructive logic is intuitionistic logic. It started with the mathematician Brouwer. It is somewhat weaker than classical logic, where the law of excluded middle holds.
import PropTypes from 'propositions-as-types'
didn't expect david grohl to be on computerphile
That's a push. A really, really, really big push.
I kept mishearing "dog" as "doc", and was confused for a bit.
Well you could always have the function return 2 regardless of input. Just discard the input.
To do that you would have to prove that 2 is an element of the return type. But when the types are variables, that isn't possible, because it would have to hold for any type. (Parametricity!)
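A Haskell illustration of that parametricity point (a sketch):
identity :: a -> a      -- the input is the only value of type a we can produce
identity x = x
alwaysTwo :: a -> Int   -- returning 2 only type-checks because the result type is concrete
alwaysTwo _ = 2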
P -> Q means "nothing but P" to "nothing but Q". It could return "nothing but Q" by not returning. Subtle
I don't understand why ~P is P -> empty, and subsequently why he couldn't translate the P v ~P
5:58 often an issue with poorly planned thought experiments and therefore statistics too...or maybe this error is why I'm stuck...
He could've said "kurzgesagt" instead of "in a nutshell" and i would understand.
When did he say "in a nutshell"?? Wait, is he trying to speak English?!?!
Yes..
Brad Pitt 👍🤘🤘
Need subtitles for this one Tooth = proposition of ant?
2B∨¬2B≝? would be the more appropriate statement.
proposischens
Type theory is superior to set theory
ohhhh, thats nice
I've never seen someone look so disappointed about going to the zoo.
I didn't get this one. :/
Look up "Curry-Howard isomorphism", it's a deep concept and usually takes a book chapter to explain fully and clearly.
This happens if nobody compliments Heinrich for dressing himself today!
It occurred to me that computers might have difficulty with things like, "This statement is false". However, one could think of it as saying, "This argument is not what it is" or "1 =/= 1"
Any thoughts?
John Hightower, it depends on the meaning you give to "This statement".
Let's call "This statement" "f(x)". Then "This statement is false" is the proposition:
f(x) = ¬f(x)
Then you can give it 2 interpretations:
1. If "f(x)" is just a meaningless symbol to you, then you get a simple contradiction as a proposition:
P = ¬P
which simply has the value false.
2. But if you interpret "¬f(x)" as the definition of "f(x)", then you get an infinite recursion, which computers have problems with, of course:
f(x) = ¬f(x) = ¬(¬f(x)) = ¬(¬(¬f(x))) ... etc
In this way, the sentence "This statement is true" has the same infinite recursion problem because of self-reference. Recursion is supposed to involve a decision about whether to continue or end (or it's just an infinite loop). Computers have problems with it because it does not make sense: the program is trying to do something with no hope of an answer. You can see it as computers having difficulty, but you could also see it as us having difficulty telling them what to do (it's a blurry line)
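In Haskell, interpretation 2 is a definition that type-checks fine but never terminates (a sketch):
thisStatement :: Bool               -- "This statement is false"
thisStatement = not thisStatement   -- well-typed, but evaluating it loops forever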
for (;;) ;
Explains things a bit, yeah?? Yeah? :P tbh good video, I liked it
best German accent ever
What is a program?
Not so fast; just glancing over the question omits several interesting distinctions. For example: is a program a collection of bits and bytes on disk or tape? No, that's a file (a program file, sometimes even an archive file of a program a la a.out or helloworld.a).
So then, is a program a collection of bits and bytes in memory? Actually, we have words for that: a process, a task, a thread, a fork, etc.
Is a program the abstraction of all of the above? The collection of algorithms in their purest, mathematical form? I don't think so. That would take it too far away from what's actually happening in the computer, GUI and bugs and warts and all.
Well, then: what *_is_* a program?
two or false, cant write a pogram for pooposition.
I am a software developer and I do not understand this man's explanations. He may be fantastically intelligent; however, he needs to brush up on teaching. A first step would be understanding that people have different ways of learning.
You have GOT to be kidding me.
Which version English does he speak...?
How many foreign languages do you speak?
How many breads have you eaten?
it's too complicated
it's the future of programming
It's rather straightforward, actually. You do need to know some basics of type theory though, so Prof. Altenkirch's previous video is recommended (or just read some books on the subject; B. Pierce's "Types and Programming Languages" is a good one if you need a recommendation)
It's actually a lot simpler than how this inept professor tries to explain it.
It's actually very simple. It's just that this dude cannot teach.
If you want to have this explained in a straightforward and understandable way, watch Philip Wadler's "Propositions as Types" from Lambda Days or Strange Loop.
No, no, go back to the doggo!
mumble mumble mumble
That last part didn't make sense... P+P->0: is that set up to state "P or (P implies 0)", which is always true, or is it "(P+P) implies 0", which is just ¬P? The equations are extremely specific and have a solution, yet the professor makes it seem like it is infinite recursion of some kind?
Implication is simply the formula:
A->B = ¬A∨B
At least in the programming sense.
What you say is correct if we explain propositions as booleans but not if we explain propositions as evidence, i.e. types.
It's *[P + (P → 0)]*, and yes, he should have included the parentheses. However, your assertion that this statement _"is always true"_ is misguided. The statement is always true in logic systems where a proposition cannot be neither true nor false, e.g. classic Boolean Logic, aka the "proposition == bool" system. However, the "proposition == type" system is not such a system. It can't be *shown* using the elements of the "proposition == type" system -- that is, using the Cartesian Product, the "OR Plus" for lack of a better name, functions, and the Empty Type, representing AND, OR, IMPLIES and NOT respectively -- that the proposition *[P OR (NOT P)]*, aka *[P + (P → 0)]*, is a tautology.
This is entirely unsurprising, because in the "proposition == type" system, it *isn't* a tautology. For the most basic example of this, let's examine the predicate equivalent of the alleged tautology, which is *[Q: FORALL x, Q(x) \/ ~Q(x)]*. For this statement to be a tautology, it would have to hold regardless of what "Q" is. Now, if the domain of Q is taken to be the Natural Numbers _[i.e. 'x' is limited to the Natural Numbers]_ and Q is defined as *[x is Prime]*, then the statement *[FORALL x, Q(x) \/ ~Q(x)]* is indeed a tautology. However, if instead Q is a predicate on the domain of programs -- or Turing Machines if you prefer -- and the definition of Q is *[x Halts]*, then the statement *[FORALL x, Q(x) \/ ~Q(x)]* is NOT a tautology in Type Theory, as it's undecidable whether a given program eventually halts or not.
This is what the professor meant when he said that *[P \/ ~P]* is not always true in Type Theory / the "proposition == type" System of Logic.
Yes, it's true that *[P → Q]* is equivalent to *[~P \/ Q]* within the context of *Boolean Logic!!* But, again, we are not talking about Boolean Logic here, but about Type Theory (Logic). Equating propositions with types quite obviously doesn't assign truth values to propositions and doesn't use Boolean operators to stand for propositional operators. That, AGAIN, would be Boolean Logic. Type Theory / Type Logic / whatever you want to call "proposition == type" Logic assigns *types* to propositions and uses *type operators* to stand for propositional operators. The type operator that corresponds to Implication is the Function, which takes an input with the type of the left side of the Implication operator and outputs something with the type of the right side of the Implication operator.
Moral of the story: you're confusing Boolean Logic with Type Logic. This video is about Type Logic, and that would explain your confusion.
Ahsim Nreiziev You have to be careful about part of that though. Intuitionistic logic indeed does not in general prove "P or (not P)", but it DOES prove "not not (P or (not P))".
"not (P or (not P))" is enough to prove "not P", as given "P" you could prove "P or (not P)", and using the "not (P or (not P))" this would allow you to prove False. So, given "not (P or (not P))" you can derive "not P". Using a similar method, you can also derive "not not P". Take as input "not P", and from that derive "P or (not P)". Then, using "not (P or (not P))" you can derive False. This works to show that "not not P" can be proven, if assuming "not (P or (not P))". So, "not (P or (not P))" is enough to prove both "not P", and "not not P". "not not P" allows you to derive False from "not P", so these two together let you prove that False. So, therefore, from "not (P or (not P))" you can derive False.
This constitutes a proof of "not not (P or (not P))".
Generally, if ordinary logic proves that "P", intuitionistic logic will at least prove "not not P", even if it can't prove "P".
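That derivation can be written as an actual program in the Haskell style used elsewhere in this thread (a sketch using Data.Void):
import Data.Void (Void)
type Not p = p -> Void
nnLEM :: Not (Not (Either p (Not p)))   -- ¬¬(P ∨ ¬P), constructively provable
nnLEM k = k (Right (\x -> k (Left x)))  -- given a refutation k of (P ∨ ¬P), derive absurdity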
Please tell me I'm not the only one who read the title as "Prostitutions as Types".......
@guy hircshorn I have bad news for you....
You're alone.
Yeah man you have to worrie about it
Yes you are
no, you are
Only you man, I read Propositions.
10th
Firat
Владимир Путин Fifthly?
A proposition is a statement such as "all even numbers divide by 2". By definition it is either true or false. Why are you trying to redefine it as something it isn't? He isn't talking about propositions at all, he is talking about a theorem, which is not the same thing. This isn't a mathematics issue, it is an English issue. Stop trying to redefine the damn language please.
Jason Thacker, don't be silly. Theorems are propositions. He is then linking the proposition (the type) with the program (its proof).
He mentioned ∀ being a part of the language, he just used concrete P, Q and R in his example to keep things simple. Here is his example with universal quantification. (The program/proof is the same.)
f : ∀p q r. p×(q+r) → p×q+p×r
f (x, left y) = left (x,y)
f (x, right z) = right (x,z)
Here is a proposition that is false, or type for which there exists no program.
g : ∀p. p
g = ?