I hope I can use Java with Valhalla before getting to Valhalla
Lol 😂
we all hope so
"No new bytecodes, no new type descriptors and no new constant pool forms". Damn. That's epic research and design approach for Valhalla.
@@curio78 hearing from whom? Also, some features are already in preview in preparation for Valhalla. There is a prototype JDK build they released, which I've tested.
@@curio78 Saying they haven't figured out serialisation based on the talk is a mischaracterisation. Even the talk doesn't say that; you seem to have misinterpreted it. They have already figured it out, and another talk published a few hours ago highlights the path value objects will take for serialisation, which is through strict initialisation. The approach uses the constructor of the value type, similar to how records do serialisation but with further strictness, and the developer specifies the default constructor to use.
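For context on the record analogy the comment draws: records already deserialise through their canonical constructor, so invariants checked there survive a round trip. A minimal sketch of that shipped behaviour (the `Range` record is a made-up example; this is not the Valhalla mechanism itself):

```java
import java.io.*;

public class RecordSerde {
    // Records deserialise through their canonical constructor, so the
    // lo <= hi invariant below also holds for deserialised instances.
    record Range(int lo, int hi) implements Serializable {
        Range {
            if (lo > hi) throw new IllegalArgumentException("lo > hi");
        }
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Range(1, 9)); // serialise a valid instance
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Range r = (Range) in.readObject(); // runs the canonical constructor
            System.out.println(r.hi() - r.lo());
        }
    }
}
```

The value-object design described in the talk layers further strictness on top of this constructor-based model.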
@@curio78 Okay, then explain how brainstorming should work and how they are doing trial and error! It's easier to type paragraphs than to refactor the whole JVM to provide these features.
"No I heard him speak". I didn't hear him speak about it in the video.
Wait, out of curiosity: which programming language do you mainly use? I suspect it's a specific one, because I tend to see this kind of negative discussion from its community.
@@curio78 then this part was meant for people like you 8:56
Top 5 things that I am incredibly excited about in java:
Type enforced nullability (thanks Valhalla)
Project Valhalla
Project Valhalla
Project Valhalla
Project Valhalla
Hm why would tearing be a problem for immutable and copied values…
Brian and his Valhalla team have just discovered the E=mc² of Java. It really is so beautifully elegant. Now they have a little matter of general and special relativity to iron out and then they’re done. I’ve been watching this space for the last 10 years and am super impressed with how clean and simple you’ve made it. Well done!
After years of waiting there is a light at the end of the tunnel. Looks really neat. Kudos to all the engineers and researchers ❤
Great insights into what's under the hood, and into how thoroughly the improvements are thought through and brought into the language step by step. Great thanks to all the people working on this with so much passion and so much attention to detail, which in the end gives the language a very well-rounded feel.
incredible how clean the suggested design is. awesome work - thanks a lot. can't wait to see this in production.
I love how this is the first JVMLS video I can't watch now but have to wait for.
Edit: To be clear, this comment is written in good fun. On a more serious note I'm thoroughly impressed with the work the Java team has done in the past few years and as a long time Java dev, I appreciate and much prefer the "get it right" approach and mentality.
Thank you for a sensible track on operator overloading!
Thank you for making me love Java a little bit more; it is really nice to see the process of modernization from the inside and learn from your experiences.
Wow, I am impressed! There are obviously some ingenious and brave design decisions involved. This seems to be another huge step forward. Maybe we can already see a bunch of this in the next LTS, Java 25?
Java 26+
The architect of the best programming language will talk about the best upcoming feature
I actually did ask for stricter initialization, but flexible pre-initialization is much better than the constant, static initialization I originally asked for. Great work, I'm very excited to see Valhalla in the standard JDK!
Hoping for a release before year 2100
soon, brother
@@mdtanveerhasan1453 which version can i expect to have this?
@@sadiulhakim7814 Java 26+
@@sadiulhakim7814 Java 25+
@@sadiulhakim7814 Java 32
This is what happens when you let features bake, it's why Java is my favorite language.
Because it's always 10y behind? I should opt for the same strategy at work and see what happens. "Where is the feature?", "it's baking"
Great stuff! Can we declare a class non-nullable by default to save people having to use "!" everywhere?
Probably not, cause it needs to be backwards compatible
I am excited!
Nice job!
I am somewhat disappointed that "integrity first" means we don't get flattening for complex values unless we make our code messy with explicit annotations/interfaces/whatever.
If the language has a volatile keyword specifically to designate things that are intended to be updated from multiple threads, why isn't everything else that doesn't have this attribute allowed to tear? It seems like a strange choice to disable all these optimizations just because someone out there may be writing poorly protected multithreaded code.
Sorry, so what happens to `ArrayList<int>`? From my watching of this video, it seems like I will be able to write `ArrayList<int>` and get a lowering optimization that backs the resulting instance with something more like an `int[]` instead of an `Object[]`, correct?
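As a rough illustration of the boxing cost the question is about, here is what today's `ArrayList<Integer>` does versus the flat layout a specialised list could use (a sketch of the idea, not the actual Valhalla implementation):

```java
import java.util.ArrayList;
import java.util.List;

public class BoxedList {
    public static void main(String[] args) {
        // Today, ArrayList<Integer> stores boxed Integer references in an
        // Object[]: each element is a separate heap object plus a pointer.
        List<Integer> boxed = new ArrayList<>();
        for (int i = 0; i < 5; i++) boxed.add(i); // autoboxing on every add

        // A specialised ArrayList<int> would effectively behave like this:
        int[] flat = {0, 1, 2, 3, 4};             // contiguous, no headers

        System.out.println(boxed.get(4) + flat[4]); // 8
    }
}
```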
I wonder how the internal array of a new ArrayList() is gonna be initialised
Bois... It is time to join the Valhalla!
VALHALLA!!!
they know how to tease us lol
Wake me up when we get there !!!
In our lifetime, we might never be able to use Valhalla. Our next generation can. Enterprise still stuck with Java 8 anyway.
But with, for example, an `Integer![10000]`, would I need to explicitly assign default values to the whole array instead of just letting the JVM assign the default of all zeros? Wouldn't all of that assignment be really inefficient? It seems like there should be a handful of built-in static methods for efficiently allocating "all zeros" or "all empty strings" or whatever.
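For comparison, bulk default-filling is already cheap with today's APIs; a sketch using plain arrays (the `Integer![10000]` syntax above is the proposed Valhalla form and isn't used here):

```java
import java.util.Arrays;

public class DefaultArrays {
    public static void main(String[] args) {
        // The JVM zero-fills primitive arrays for free at allocation:
        int[] zeros = new int[10_000];

        // For reference arrays, Arrays.fill is a single pass that the
        // JIT can turn into a tight, often vectorized, loop:
        String[] empties = new String[10_000];
        Arrays.fill(empties, "");

        System.out.println(zeros[9_999]);         // 0
        System.out.println(empties[0].isEmpty()); // true
    }
}
```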
Hope we get a preview in OpenJDK 24, that would be great 👍
Well Done
Can't wait to write leetcode solutions you can actually read, without primitive juggling. Keep up the great work 🙏
Can't wait for project lilliput, I hope Java memory usage can be reduced by half
Use GraalVM for that.
@@pompiuses Nope, Graal has many limitations because of its closed-world nature. What's the point of Lilliput if it can't do that?
@@altgamz1072 Lilliput will just be a drop in the bucket.
@@pompiuses I agree with altgamz; it should be able to reduce memory usage by at the very least 20-30%
Half??? Reducing header size by half won't reduce object size by half. Lilliput expects between a 10% and 20% reduction in size on average.
End of an era.
Can I download the slides from somewhere? I want to show them at work.
2030 Java is gonna be cool again
Now let’s hope this stuff doesn’t need another few years of ripening in preview after preview.
🎉
Now let's see who wins the race: Project Valhalla, or string interpolation?
Folks, I'm writing a game engine in Java. I'm following and waiting for this project to get out. Please end my suffering
My opinion on operator overloading is that there should be no operator overloading in Java. My reasoning is that operator overloading reduces clarity without bringing any material benefits (syntax sugar notwithstanding). Operator overloading is confusing (what does "+" mean ?) and incomplete (there are far more operators needed than are present in any language). And I fail to see the benefit of writing "p1 + p2" instead of "p1.add( p2 )" for anything other than pre-defined primitive types. Others may have a different opinion, I am fine with that, but "no operator overloading in Java" is mine.
You've probably never tried to read moderately complex code dealing with vectors or BigInteger/BigDecimal. It's fine for a simple "a + b", but consider a relatively simple expression like this:
var result = x*(ax-bx) + y*(ay-by) + z*(az-bz) + someConstant;
Now with no operator overloading:
var result = x.mul(ax.sub(bx)).add(y.mul(ay.sub(by))).add(z.mul(az.sub(bz))).add(someConstant);
You can't possibly tell me that this sort of mess is somehow easier to understand than the expression with operators. You may be able to format it so that it's at least somewhat readable (though very unconventional) but nowhere near the original:
var result = ( ( x ) .mul ( (ax) .sub (bx) ) )
.add( ( y ) .mul ( (ay) .sub (by) ) )
.add( ( z ) .mul ( (az) .sub (bz) ) ).add(someConstant);
And "there aren't enough to cover all of math" is not a good justification for not doing it at all.
Honestly, I think a simple restriction of "operator overloading only for value classes" would already at least strongly suggest that operators shouldn't mutate the state of an object. And I think the biggest danger of operator overloading is an operator mutating an object instead of giving a new one as a result. Otherwise I really see no difference between "what does a + mean?" vs "what does a .add mean?". In both cases a reasonable IDE should be able to show you what either of them does, possibly even highlight overloaded operators differently.
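To make the comparison concrete with a real shipped API, here is the method-call version written with `BigDecimal`, whose actual method names are `multiply`/`subtract`/`add` rather than the shorter `mul`/`sub` used above (the operand values are made up for illustration):

```java
import java.math.BigDecimal;

public class DotProduct {
    public static void main(String[] args) {
        BigDecimal x = new BigDecimal("2"), y = new BigDecimal("3"), z = new BigDecimal("4");
        BigDecimal ax = new BigDecimal("5"), bx = new BigDecimal("1");
        BigDecimal ay = new BigDecimal("6"), by = new BigDecimal("2");
        BigDecimal az = new BigDecimal("7"), bz = new BigDecimal("3");
        BigDecimal someConstant = new BigDecimal("10");

        // x*(ax-bx) + y*(ay-by) + z*(az-bz) + someConstant, spelled out:
        BigDecimal result = x.multiply(ax.subtract(bx))
                .add(y.multiply(ay.subtract(by)))
                .add(z.multiply(az.subtract(bz)))
                .add(someConstant);

        System.out.println(result); // 2*4 + 3*4 + 4*4 + 10 = 46
    }
}
```

Note that every `BigDecimal` operation returns a new instance rather than mutating the receiver, which is exactly the non-mutating discipline the comment argues value-class-only operators would encourage.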
C# about to be BTFO
Why?
C# can now do AOT compilation (CoreRT), so C# is still better than Java.
First, Vector is a problem that was solved eons ago with:
* Vector
* Auto-Boxing & Auto-Un-Boxing
Second, To me, the difference between an Object and a Primitive is like the difference between an Apple and an Orange. Furthermore, the most applicable / usable Value-Classes (Integer, Long, Float, etc..) have already been rolled out ages ago, right? What am I missing?
Classes that have 'final' declared Data-Fields could all be converted to 'Value-Classes' but how much "Value" or "Savings" (and in terms of what metrics) are being achieved with this idea? It seems to me like this idea would add an infinite amount of complexity (on the Oracle side of the Coding-Fence), but I don't understand the gains. Classes with "Low-Amounts of Data", and "final-constant" data could be declared "Value" Classes? But are there actual and realizable speed-improvements? Is the Garbage-Collector operating more efficiently?
So... Implementation details aside, the only change to the language design is: rename T.val to T!, and rename T.ref to T?
Just give us Valhalla already wtf is wrong with you.
Tbh, if you repeat that comment 9 more times, Valhalla would be in your hands already.
@nipafx
Valhalla would be in your *PANTS* already if you repeat that comment 9 more times.☺
I'm going to keep harping on this: why are you making value class fields final? If you persist with that design decision then it no longer "codes like a class" and I don't see the benefit. Why can't you allow fields to be declared final or not? I believe that is how C# implements value types.
Because making fields final ensures there will be at most one write to each field, which allows the JVM to do
1) more and better flattening.
2) more optimizations under the hood.
That's what Valhalla is all about.
If you need your fields to mutate then there is no point in using value classes/records, since the JVM won't be able to optimize memory and performance as much as it could, but you would still pay the trade-off of lacking identity. So why would you use a value class/record then?
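To illustrate the point with what already ships: records are the closest shipped analogue to value classes, and "mutation" becomes constructing a new instance, which the JVM can often scalarize away. A small sketch (the `withX` helper is a made-up example of the wither pattern):

```java
public class ValueStyle {
    // Records are the closest shipped analogue to value classes:
    // all fields are final, and equality is by state.
    record Point(int x, int y) {
        Point withX(int newX) { return new Point(newX, y); } // "mutation" = new instance
    }

    public static void main(String[] args) {
        Point p = new Point(1, 2);
        Point q = p.withX(5);
        System.out.println(p);             // Point[x=1, y=2] — original untouched
        System.out.println(q.x() + q.y()); // 7
    }
}
```

Because every field is written exactly once, the JVM is free to copy the whole instance by value, flatten it into containers, or dissolve it into registers, none of which is safe if a field could be rewritten behind its back.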
@@Ewig_Luftenglanz How does making fields final provide better "flattening"? Data-oriented models require controlling memory layout and mutability. Laying out structs in a vector is a common pattern, as is creating your own arenas. The whole point of value classes is controlling memory allocation and size. The feature is the optimization.
@@Ewig_Luftenglanz Thanks for the response. I don't know the particulars of JVM optimization, but I do know raw coding in C and C++, and even FORTRAN back in the day. In the end it comes down to allocating raw memory blocks and navigating that memory with address pointers. Somewhere there has to be a pointer to the value object, and then you get to its fields relative to that starting pointer. I did a lot of HPC engineering programming, and in that area I like to allocate memory to hold massive amounts of input data and output data, plus state data that gets changed every iteration of the solvers. Having that memory clustered together reduces paging and other performance-robbing tasks. I was really excited about the flattening aspect of value objects, but if I can't change them then I can't allocate that memory ahead of time and use it for output and state data.
@@Ewig_Luftenglanz This is a follow up to my last comment, I'm just thinking out loud. To achieve flattening/clustering of objects in memory why don't you implement an array allocator that instantiates all of the objects in the array when the array is allocated? That way all objects in the array can be created in one continuous memory block.
// Suggestion
Point[] array = new new Point[1000000]; // Create Point array, instantiate all objects in continuous memory
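For what it's worth, the closest you can get with today's identity classes is allocating the reference array and filling every slot in one pass; a sketch (the `Point` record here is a stand-in for the suggested value class):

```java
import java.util.Arrays;

public class Prealloc {
    record Point(int x, int y) {}

    public static void main(String[] args) {
        // Allocate the reference array, then fill all slots in one pass.
        Point[] arena = new Point[1_000_000];
        Arrays.fill(arena, new Point(0, 0));
        System.out.println(arena[999_999]); // Point[x=0, y=0]
    }
}
```

With identity classes every slot shares one instance via references; with value classes, which have no identity, shared references and per-slot copies would be observably indistinguishable, which is part of what makes a flattening allocator like the one suggested above plausible.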
@@sstevenson638 if they went with zero initialization instead of strict initialization, new Point[100000], would just be zeroed out memory. No need to initialize anything from the JVM side of things. No hidden code running.
Since value classes are immutable and require strict initialization, creating an array like that gets more complicated. It is a bad design.
I can clearly see that Java's sunset is coming faster. Maybe my comment seems funny now, so hold your laughter until it happens; then we will laugh together, my friends.
But why?
@@ahuramazda9202 please elaborate