Fantastic, insightful presentation on C++ modules with a review of the tooling landscape. Thanks Daniela Engert for helping the C++ community get on board the C++ modules vision 😇
th-cam.com/video/iMNML689qlU/w-d-xo.html: the thing about modules is that there is only one file to declare and implement a module - the export keyword brings things to the user of the module. There is no separate module interface unit the way a header file was. The drawback is that compilation is now a partially two-pass affair: first all modules need to be compiled (unless they already exist as binaries), and then everything that depends on them is compiled. A plain Makefile won't cut it any longer; CMake has supported this since version 3.28. And your toolchain needs to be able to do dependency scans (e.g. Apple's clang 15 does not support that).
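For illustration, a single-file module might look like this (a minimal sketch; the module name 'hello' and the function 'greet' are made up for the example):

// hello.cppm - interface and implementation in one file
module;                     // global module fragment for old-style includes
#include <iostream>
export module hello;        // this file declares the module 'hello'

export void greet() {       // 'export' makes greet visible to importers
    std::cout << "hello from a module\n";
}

// main.cpp - a consumer
import hello;               // reads the BMI built from hello.cppm

int main() {
    greet();
}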
Huh. Yes, we can still use separate interface and implementation files.
The only reason we scan for dependencies before compilation starts is that compiler implementers wanted to seize the opportunity to improve build times by modularizing precompilation. Think of BMIs as PCHs made modular (this is what clang did a decade ago, before standardization).
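As a sketch of that split (the module name 'math' and function 'square' are made up):

// math.cppm - module interface unit
export module math;
export int square(int x);           // exported declaration only

// math.cpp - module implementation unit
module math;                        // same module, no 'export module' here
int square(int x) { return x * x; }

// consumer.cpp
import math;                        // sees square() via the interface's BMI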
It seems like it's easier to set up cross-linking with Rust than modules in C++.
Rust doesn't have 3 implementations. And it's still early days for module implementations in C++, sadly.
It is quite easy with cargo-zig build.
@@MeetingCPP Does Rust need 3 implementations? Like, is it actually good?
It's the reality for C++, and especially for modules. Rust should have an ISO standard too.
@@MeetingCPP I'm not paying 230 euro for a copy of a PDF, and neither are 99.999% of people doing work with C++. The fact that the ISO standard exists is inconsequential because of that. I'm happy to see gccrs making progress, though.
About 7 minutes of the hour are "don't use it directly, never, it's impossible, use a build system". Of course, we are C++ programmers, we never need to use the compiler directly and never need to understand what it does, as usual (NO).
Modules are a really big fail. Just compare the simple C model:
.cpp file + include path => .obj
[...obj] => result
with C++20:
.cpp + include path + ??? (what exactly - maybe .cppm or .cpp or something else) => .obj || .bmi
*really, no one understands what happens next: somehow, with several invocations of the compiler, many .bmi files are used to create many .obj files, which are then linked into the result*
I don't see a reason for header units, private module fragments, partial headers, etc. Why is it not simply "okay, now we have a module TU":
.cpp with 'export module' in it + include path + .bmi path => .bmi
.cpp + include path + .bmi path => .obj
[...obj] => result
Modules also force the compiler to check all code for module declarations before it can build the dependency graph, which actually slows compilation down (with interesting behavior around the preprocessor...).
There is also a problem: all C++ compilers work on one file at a time. So even if we already have module maps etc., when the compiler compiles a .cpp file it needs to know about the modules. And if I have 1000 compiler instances, each working on its own .cpp file, does each one load all the .bmi files into memory??? I think it's possible to optimize (at the OS level with read-only files, for example), but there is a problem here, isn't there?
First of all, C++ compilers use the C build model, so anything you say about building C++ is true for C as well.
Second, yes, tooling is awful, and I cannot for the life of me understand why toolchains in 2024 do not support aggregation of source files into one compilation unit. The same is true for C, though.
So the demands of developers need to be made clear: improve the build tools. By a lot. But that's not a fault of the language.
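To make that C-style build model concrete for modules, the by-hand steps with a recent clang look roughly like this (a sketch only; flag spellings differ between compilers and versions, and the file names are the made-up 'hello' example from above):

# 1. precompile the module interface into a BMI
clang++ -std=c++20 --precompile hello.cppm -o hello.pcm
# 2. turn the BMI into an object file
clang++ -std=c++20 -c hello.pcm -o hello.o
# 3. compile the importer, telling the compiler where the BMI lives
clang++ -std=c++20 -fmodule-file=hello=hello.pcm -c main.cpp -o main.o
# 4. link as usual
clang++ hello.o main.o -o app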
The BMIs don't need to be loaded entirely into memory, thankfully. A BMI contains indexes, and parts of it are loaded lazily. That's why they decided to put the whole standard library in just one big module, for example.
But yes, I think build systems and compilers should be more tightly coupled, to avoid starting a compiler driver process for each translation unit.
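That one big module is the C++23 std module. With a toolchain that actually ships it, consuming the entire standard library is a single import (a sketch, assuming C++23 and std-module support in your compiler and standard library):

import std;                          // one BMI for the whole standard library

int main() {
    std::println("hello from the std module");   // C++23 std::println
}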