
I'm super disappointed with modules. It's 2023, we've pushed out a new C++ standard since modules were standardised, and they're not usable.

They're not supported in CMake (unless you set CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API to 2182bf5c-ef0d-489a-91da-49dbc3090d2a, if you're on the same version of CMake as I am).
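For anyone wanting to try this today, the opt-in goes at the top of your CMakeLists. A rough sketch, under the assumption you're on the matching CMake release: the UUID is deliberately rotated every release to discourage production use, and the DYNDEP variable was needed in the same era of experimental support.

```cmake
# Opt in to CMake's experimental C++ module support. The UUID is tied to
# one specific CMake release and changes with every new release, so check
# your CMake's Help/dev/experimental.rst for the right value.
set(CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API
    "2182bf5c-ef0d-489a-91da-49dbc3090d2a")
set(CMAKE_EXPERIMENTAL_CXX_MODULE_DYNDEP 1)
```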

My experience is that no IDEs support them properly, so either you need to #ifdef out for IntelliSense, or go back to no tooling support.

The format breaks every update, so you can't distribute precompiled modules.

Last but definitely not least, they're slower than headers.

So, we've got a feature that we started talking about 11 years ago, standardised 3 years ago, still not fully implemented, with no build system support, and a downgrade over the existing solution in every real-world use case I've seen. What a mess.



Module support will be stabilized in CMake 3.28, actually: https://www.reddit.com/r/cpp/comments/16y9qv2/cmake_c_module...
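For the curious, the stabilized interface declares module sources through a CXX_MODULES file set. A minimal sketch (target and file names invented for illustration); it also needs a toolchain with dependency-scanning support, roughly MSVC 19.34+, Clang 16+, or GCC 14+ with the Ninja or Visual Studio generators:

```cmake
cmake_minimum_required(VERSION 3.28)
project(demo CXX)

add_library(my_modules)
# Module interface units go in a CXX_MODULES file set so CMake can
# order the build by scanning inter-module dependencies.
target_sources(my_modules
  PUBLIC
    FILE_SET CXX_MODULES FILES my_module.cppm)
target_compile_features(my_modules PUBLIC cxx_std_20)
```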

The other points (mainly integration points) are definitely valid though.


That's really great to hear. Having proper build system support will finally let us use this in anger and report actual issues upstream.


Please also report relevant issues to CMake itself. We can't fix what we don't hear about :)

FD: CMake developer.


I absolutely will. I'm usually not cutting-edge enough to catch stuff; by the time I find it, it's usually already been reported.


> I'm super disappointed with modules. It's 2023, we've pushed out a new C++ standard since modules were standardised, and they're not usable.

Important question: Who are you disappointed in?

Keep in mind that the ISO committee does not employ engineers and for international treaty reasons, it reasonably cannot.

Point being, who should be doing this work? Are you talking to them about your expectations? Are you providing actual support to implementations that are public goods (i.e., open source compilers)?


I have gripes with many parts of the process, predominantly the committee.

> Keep in mind that the ISO committee does not employ engineers and for international treaty reasons, it reasonably cannot.

Engineers not being employed by the committee doesn't mean I can't hold them to a standard. The standards committee have shown that they're more interested in freezing popular libraries and putting them in the standard library. It's clear that major changes are preferred as library changes, not language changes (ranges should be a language feature, as should span and optional).

> who should be doing this work?

I'm not sure what you're getting at here. The standards committee should have used some of the time between 2004 (when this was first proposed) and 2019 to get this done. The compiler vendors, 4 years later, should have made this usable. The build systems are at the mercy of the compiler vendors, but CMake has had 3-4 years to get this in.

> Are you talking to them about your expectations? Are you providing actual support to implementations that are public goods (i.e., open source compilers)?

It's not fair for you to suggest that it's my fault this is a mess, 15 years on.

To answer your question, I've tried multiple times over the last 3 years, hit show stopping bugs and issues, and have found them already reported in clang's issue tracker or the VS feedback forums. I've spoken with developers here and on Reddit about issues with MSVC, and cmake.


> It's not fair for you to suggest that it's my fault this is a mess, 15 years on.

All I did was ask specific questions. Specifically, who are you griping about?

It's also not fair to expect ISO, a documentation and consensus building body, to act like an engineering org under contract to us. That's not what it does, and it really couldn't do that if it wanted to.

Reporting, clarifying, and otherwise contributing issues is helpful of course. But bottom line is that someone needs to implement fixes.

I just want to make sure we aren't all just sitting in a big room yelling about why someone isn't fixing things for us. Most of the time it seems like we're not far off from that.

We should be holding our vendors more accountable of course. But also our leadership (like CTO offices) that seem fine using all the open source tech for free without contributing back in any form.

But ISO? I think C++ is becoming too complicated for volunteer compiler contributors to keep up with, but I don't think putting more language features into C++ is going to help that at all.

Maybe we call C++ dysfunctional and move on to something else, but I don't see how the same problems don't just show up there in another ten years. It's hard to get around lack of funding. Syntax isn't going to fix funding issues. Consolidation could, but we seem to be going the other direction, at least in the short term.


> It's also not fair to expect ISO, a documentation and consensus building body, to act like an engineering org under contract to us. That's not what it does, and it really couldn't do that if it wanted to.

This is a bad faith strawman argument. You're the one who said they're not employees. Again, they can be held accountable for their decisions whether they're employees or not.

> But also our leadership (like CTO offices) that seem fine using all the open source tech for free without contributing back in any form.

Speak for yourself here. Many people (myself included) work at organisations that do contribute back either financially, or by reporting issues and submitting fixes (both of which I've done to our OSS dependencies in the last month). It's not fair of you in one breath to say "I'm just asking questions", "what exactly are _you_ doing to fix this", and immediately follow it with "nobody is contributing anything back". And if your response here is that you weren't talking about me specifically, then you need to decouple your interrogation from your soapbox.

> But ISO? I think it is being too complicated for volunteer compiler contributions to keep up with,

The volunteer compilers like MSVC (which is proprietary), GCC (which, based on the last time I looked, is primarily developed by employees of Red Hat and IBM), or Clang (which has been massively contributed to by Apple and Google until very recently)?

> It's hard to get around lack of funding. Syntax isn't going to fix funding issues.

I'm not sure what you're getting at here, at all.

> Consolidation could, but we seem to be going the other direction, at least in the short term.

Agreed here, unfortunately. We've had a decade-plus of consolidation, and what we've ended up with is a camel (IMO).


> ranges should be a language feature, as should span and optional

That’s an opinion. As you noted yourself, “It's clear that major changes are preferred as library changes, not language changes”

The spirit of C/C++ is to put just enough in the language itself to allow programmers to implement such things in them.


> That’s an opinion.

That's fair, actually.

It's an opinion I feel strongly about, though. Honestly, I think C++'s reliance on library features to patch over deficiencies in the language is a cop-out. We knew about the unique_ptr ABI performance issue a decade ago, and decided that string_view should be implemented the same way.

Ranges are "the kitchen sink", and cause significant compile time issues whether you want to use them or not.

Spending time on libraries like fmt (which is an excellent library) is paying lip service to progress by locking in a 5 year old library (at this point).

> The spirit of C/C++ is to put just enough in the language itself to allow programmers to implement such things in them

To use your own words against you, that's an opinion. Mine is that the spirit of C++ (which is what we're talking about here, not C) is to not rock the boat too hard, or everyone will have an opinion.


The ISO committee should not be in the business of inventing language features out of whole cloth. That is not how “standardization” works. They should only standardize existing practice.


> The ISO committee should not be in the business of inventing language features out of whole cloth. That is not how “standardization” works. They should only standardize existing practice.

How do you expect to standardize a feature that does not exist yet but the whole community is demanding?

It sounds like the C++ standardization committee designed a feature following a process where all stakeholders had a say. Is this worse than being force-fed a vendor-specific solution?


> How do you expect to standardize a feature that does not exist yet but the whole community is demanding?

Implement it in a popular compiler or library.

I know we are talking about C++, but the ISO C committee, over the past several decades, has repeatedly ignored working, good features in GCC and invented crap in their place.


This is part of the modules mess in the first place. Modules were implemented in MSVC and Clang, and neither agreed with the other's implementation.

Also, if MSVC came along with an "extension" to C++, I can only imagine the uproar suggesting that this is now step 2 of EEE.


I do not expect standardization of features that do not exist.


> I do not expect standardization of features that do not exist.

But that makes absolutely no sense, does it? I mean, think about it for a moment. Specifying a standard behavior is not about picking a flawed winner from a list of ad-hoc implementations or settling for the least common denominator. Specifying a standard behavior is about providing the best possible solution that meets the design requirements, following input from the whole community.


My expectation is that the ISO C and ISO C++ committees recognize that things would be best off if they dispersed.


The progress is frustratingly slow. My understanding is GCC and Clang still haven't finished implementing them fully. Last I read they are still making significant changes, for example moving to the strong ownership model for symbols. Once they're done, hopefully the build system and IDE support will follow quickly. MSVC seems to be in much better shape.


The progress is mostly unfunded. It's not that the work is especially slow (maybe it is a little?). Mostly it's that barely anyone is working on it.

This is especially frustrating for organizations ostensibly paying vendors for high quality compilers.


It's frustrating when you consider how many hours are lost to compile times across the whole industry.

I have no idea how many people are slinging C++ code, so let's just calculate it per 1000. Assume modules save an average of 10 minutes a day.

1000 coders × 250 days/year × 0.167 hours/day → 41,666 hours a year per thousand coders. Or about 21 man-years.

Yeah you'd think it'd be worth funding heavily.


When you use a good build system like Bazel, modules do not save 10 minutes a day. The time saving is negligible. That's why large C++ shops do not care about modules and only volunteers are working on them.


Not everyone uses Bazel. I would wager that a vanishingly small number of people use Bazel. If "other people already use something else" were enough to block standardisation, we wouldn't have asio, ranges, or fmt. Precompiled headers also exist, so why would we standardise modules?


They are not slower than headers. I've been looking into it because a modular STL is such a big win. On my little toy project I have .cpp files compiling in 0.05 seconds while doing import std.

Downside is that at the moment you can't mix normal header STL with module STL in the same project (MSVC), so it's for cleanroom small projects only. I expect that the second you can reliably use it, almost everyone will switch overnight just from how big a speed boost it gives on the STL vs even precompiled headers.


The one way in which they are slower than headers is that they create longer dependency chains of translation units, whereas with headers you unlock more parallelism at the beginning of the build process, but much of it is duplicated work.


Every post or exploration of modules (including this one) has found that modules are slower to compile.

> I expect the second you can reliably use that almost everyone will switch overnight just from how fast of a speed boost it gives on the STL vs even precompiled headers.

I look forward to that day, but it feels like we're a while off it yet


I assume templates can only be partly preprocessed (parsed?) but not fully pre compiled, since final code depends on the template types?


Depends on the compiler. Clang is able to pre-instantiate templates and generate debug info as part of its PCH system (for instance, most likely you have some std::vector<int> that can be instantiated somewhere in a transitively included header).
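For reference, the workflow being described looks roughly like this. The flags are real Clang options (available since around Clang 11); the file names are made up for the sketch:

```shell
# Build a PCH with templates pre-instantiated, and with their object
# code and debug info emitted into a shared object file:
clang++ -x c++-header common.h -fpch-instantiate-templates \
        -fpch-codegen -fpch-debuginfo -o common.h.pch
# Compile the PCH itself into the object file holding that shared code:
clang++ -c common.h.pch -o common.o
# Each TU reuses the pre-instantiated templates instead of redoing them:
clang++ -include-pch common.h.pch -c main.cpp -o main.o
clang++ main.o common.o -o app
```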

In my projects enabling the relevant flags gave pretty nice speedups.


Yes, but template code is all in headers, so it gets parsed every single time it's included in some compilation unit. With modules this only happens once, so it's a huge speed upgrade in pretty much all cases.


Whenever I’ve profiled compile times, parsing accounts for relatively little of the time, while the vast majority of the time is spent in the optimizer.

So at least for my projects it’s a modest (maybe 10-20%) speed up, not the order of magnitude speed up I was hoping for.

Thus C++ compile times will remain abysmal.


For some template heavy code bases I’ve been in, going to PCH has cut my compile times to less than half. I assume modules will have a similar benefit in those particular repositories, but obviously YMMV


That's the pinnacle of C++ development!


It's still brand new, in C++ feature terms. I'm not surprised by, and would totally expect, all of the problems you cited.


I think we've been talking about this and trying to implement it for long enough that "it's still new" stopped being an excuse 2 years ago.


> The format breaks every update, so you can't distribute precompiled modules.

When will they fix this? People keep using C to this day because these unstable binary interfaces make it so no one can reuse software. You can't write a library in almost any other language and expect it to just work. To this day, C is the lowest common denominator of software because of stuff like this.


Unlikely to be fixed. The module files are basically AST dumps and no compiler keeps that stable over time.

Even flag selections make incompatible BMIs. So if one TU uses C++20 and another uses C++23, they'll each need their own BMI of anything they import.
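As a concrete illustration of the flag problem, using Clang's standalone module workflow (real flags; module and file names invented): BMIs built under different -std levels are separate artifacts, and each importing TU has to be pointed at the one matching its own flags.

```shell
# One BMI per language mode (and per relevant flag set):
clang++ -std=c++20 --precompile m.cppm -o m.cxx20.pcm
clang++ -std=c++23 --precompile m.cppm -o m.cxx23.pcm
# A C++20 TU must import the C++20 BMI; mixing modes is rejected:
clang++ -std=c++20 -fmodule-file=m=m.cxx20.pcm -c use20.cpp
clang++ -std=c++23 -fmodule-file=m=m.cxx23.pcm -c use23.cpp
```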

FD: CMake developer that implemented the modules support


If module files are pretty much memory dumps, that makes it all the more frustrating that it's taken this long, given that's essentially what precompiled headers are.


There are new rules about "reachability" and "module linkage". Not to mention things like methods defined within the class declaration are no longer as-if `inline` when within a module. It's not just "a faster preprocessor" mode; there are real rules associated with module boundaries.


C++ has stable binary interfaces just like C.


I've seen compiled C++ code turn out to be incompatible with code produced by different versions of the same compiler. There is no way this is stable.


Most compiler vendors have promised forward-compatible stability for a while now. You're never going to have perfect compatibility, since I can always compile with -DMY_FLAG=1 and change the behaviour of everything. But GCC and Clang haven't had a breaking change in a decade, and I believe MSVC has been compatible since 2015.


My feeling is that the desire for it has somewhat waned because so many people that care about the elegance of programming languages and use C++ have just moved to Rust. There are still plenty of people using C++ of course, but it certainly feels like more of a dead end than it did before Rust. Why bother putting a mountain of effort into maybe slightly improving it when in 10-20 years you won't be using it anyway?


The number of Rust developers is a drop in the ocean compared with C++ developers. There are more than 5 million C++ developers out there starting new C++ projects every single day. I am one of them.


Not that I disagree that there are a large number of C++ programmers out there, but where did you get that number?


Yeah obviously that's true now because C++ has been around for literally decades and had basically 0 competition for all that time.


And it will continue to be true for a very long time since the world runs on software written in C and C++ and that needs to be maintained and improved.



