My //build/ talk on Friday @ noon PDT (webcast)

The session schedule for this week’s //build/ conference in San Francisco has now been posted.

I have a talk on Friday at noon Pacific time, titled “The Future of C++.” Note this is a Microsoft conference, so the talk is specifically about the future of the Visual C++ product, but nevertheless it’s all about Standard C++ because I’ll start with a short update on ISO goings-on and the bulk of the talk will be an update on standards conformance in Visual C++ and explaining a number of the most modern ISO C++ features.

On Friday, you can watch my talk live at Channel 9. In the meantime, you can get the keynotes and some other major sessions at the same link all day today, tomorrow, and Friday… as I write this, a cool guy named Steve has the camera and just gave away thousands of nifty 8″ tablets. I’ll be in the same place in two days to talk C++.

If you’re in San Francisco for //build/ and care about C++, you want to be in South Hall: Gateway Ballroom for session 2-306.

If you’re not at the conference but use Visual C++ as one of your C++ compilers, you’ll want to watch the talk live in the webcast or on demand about 24-48 hours after the talk ends.

Even if you don’t use Visual C++ a lot right now, you might find some of the ISO C++ standards context and updates interesting.

Stay tuned.

29 thoughts on “My //build/ talk on Friday @ noon PDT (webcast)”

  1. Will the talk cover the awkward question of “what happened to the promise of out-of-band compiler updates”? You were pretty vocal a year ago about how we all should just “wait and see” all the C++11 improvements that were going to be delivered out-of-band for VS2012. ;)

  2. At least the example code from your own slides (the C++ Concurrency talk) now also works with VS. Thanks to finally-working uniform initializers ;). But I agree with Jay, all this was promised to be in VS2012 via free updates.

  3. It was a very interesting talk, even for a non-Microsoft developer. Thanks for the heads-up!

    The Channel 9 video production was slick, but somewhat thoughtless. Somebody had too much fun pressing those video manipulation buttons and choosing cool camera angles. Far too many times, while code was being discussed, the code was obscured or totally invisible.

    Regards,

  4. Thanks for the info. It’s great to hear that there are at least plans to fix some of the more serious issues others have, such as getting two-phase lookup and ‘more fixed’ C header file preprocessing and all!

  5. I just watched the talk and I’m looking forward to the plans shown for VC++.

    On the question about how widely applicable deduced return types will be, I thought I’d point out that once modules are adopted it will be useful not just for template functions and other functions that appear in headers. Since modules, as described in Douglas Gregor’s presentations, allow for translation units to import declarations exported by other translation units without headers, one will be able to define a normal function with a deduced type, and whatever the deduced signature is will be importable into other translation units.

    Modules will be one of those added features that make C++ simpler.

  6. Oh, and two more things: You mentioned that VC++ hasn’t been using ASTs and that’s why it hasn’t been able to implement features like two-phase lookup. Is there someplace I can read more about that?

    Secondly, I and people who’ve been complaining about non-conforming behavior will be happy to have two-phase lookup, as well as any other fixes we can get, but I wonder how loud the complaints will be from others who just want their legacy code to keep working with the new compiler?

  7. It is nice that the missing features are coming. And it is great what is coming beyond that. I personally am a little disappointed that the new unions come so late, but I understand that the majority of users would profit more from the other features.
    But two things are not clear to me: You said that the VS 2013 Preview is ready to be used in production, in commercial applications? When do you plan to start shipping the new VS? Is it worth waiting until then, or does one get a free upgrade if one orders VS 2012 now?

  8. Slightly off-topic: Herb, could you please tell me if you’re still hiring? I mean the C++ team, C++ AMP, Casablanca, library and compiler teams, etc…

  9. Given the promises that were made regarding MSVC++ 2012 updates, and given the current situation, how many Visual Studio versions do you think we are willing to pay for? Can we even trust that what has been communicated today will not change as well?

  10. Ok, so several parts of the talk caught me by surprise: fixing two-phase lookup and adding (some) C99 functionality? I really didn’t see that coming. (If I wanted to be mean, I would say that I also never expected your team to actually *finish* variadic templates, given the delays so far, but hey, that would be mean :))

    Anyway, I think those improvements are much needed, so thank you for that. :)

    However, as others above have said, it’s hard not to be concerned about a few things:

    – you’ve promised that *this* release would be different several times before. Remember the paradise of out-of-band feature updates we were promised for VS2012? Remember how you said back in 2010 that *now* your team was going to take C++11 seriously, after which… pretty much nothing happened? This roadmap looks very impressive, but can you deliver? (Or, again putting on my cynic hat, if you haven’t been able to deliver for the past 4 years, what is going to make the next year or two different?)

    – the faster release schedule seems to mean we’ll have to pay up twice as often, but what does it get *us*? It might reduce the latency from when a feature is developed until it is made available to us, sure, but I fail to see how it does anything to increase the throughput — and let’s be honest, the MSVC team’s throughput in terms of C++11 features has not been impressive so far. So we have to pay up more often (in a world where nearly every IDE and every compiler is available for free, I might add), but for what? Is that somehow key to the plan to catch up on C++ conformance? Why couldn’t (some of) these new features be shipped as free updates? (And no, I don’t buy the argument that simply putting them behind a paywall somehow makes them less risky or less likely to break existing code. Supporting and maintaining the updates, and making them optional, reduces that risk. The paywall does not. That just seems like short-sighted greed.)

    Anyway, I am impressed at the ambitions and intentions highlighted in your talk. I hope that *this time*, your team can deliver on your promises. :)

  11. Does the promise of a CTP for VS 2013 mean the same as last time? A tease of what will come in VS 2014? I bought VS 2012 because you promised “out-of-band” C++ compiler updates. Three updates so far, and not a single feature added from the Nov CTP. And now for some reason we suddenly need a new VC just a year after the previous release. It used to be 2 years at least. Please don’t promise what you can’t keep.

  12. @Herb, can you please make a statement in no uncertain terms, with your credibility on the line, as to the exact nature of updates (+out of band) including language features, libraries and bug fixes for this recent release.

    I’m not asking you to promise great things, but as we are considering purchasing new licenses, we would like to know if we should:
    1. Go ahead and pay up
    2. Hold off for a little while and see what things are actually added/updated in 2013
    3. Simply move across to the Intel tool chain.

    We need to make a decision urgently.

  13. In the presentation, you said the current VC does not generate a complete AST. To my knowledge, there was a similar attempt to re-engineer the compiler codebase using an AST. ( http://blogs.msdn.com/b/vcblog/archive/2006/08/16/702823.aspx ) In particular, I was interested in the plan to provide an AST API — though it was never delivered to users. Do you have any similar plan to expose a programmable AST API to users?

  14. @Jay: I hope I didn’t give that impression last year, and I chose my words carefully to try to be very precise. I think you’re combining statements from two different talks:

    First, in Feb 2012 at GoingNative, we said we’d ship conformance features in “out-of-band” CTP and RTM updates faster than VS’s 2-3 year release cycle; for the CTP part, we did so by shipping the Nov 2012 CTP (and soon another one), and for the RTM stream, it turned out later that all of VS moved to a single-year release cycle so we didn’t need to do anything more than snap to that train and release Preview this month to get an RTM go-live compiler with new conformance updates, which we did. I think one place the confusion got started was that people took the “out of band” from this talk and connected the dots to the VS Updates that came later, but we didn’t even know about the Updates at this time.

    Then, in Nov 2012 at Build, we shipped the November CTP and announced that we planned “more batches” in the first half of 2013. And we missed the “more batches” promise — we did ship a batch this week at RTM quality in VS 2013 Preview, but “batches” is plural and that ended up being the only batch, so we missed. What we had intended was to ship an additional CTP in the spring, which didn’t happen mainly because VS 2013 ended up materializing on a short release cycle so we concentrated on getting the CTP features up to RTM quality (which means not just bug count, but with IDE support, Intellisense, debugging, and especially standard library use of the language features).

    Somehow the meme got started that we promised features in Updates, and we did not (at least, if we did, please point it out so I can correct it). I saw this happen over the winter and probably should have stomped on it sooner: Around Jan/Feb people started saying that “maybe” some of the CTP features would be in one of the Updates; then around Mar/Apr I saw people saying that “definitely” some of the CTP features would be in the next Update (which was not true and I probably should have corrected it then); then in just the past month or so I’ve started seeing it grow all the way to “Herb promised they would be in an Update” which I carefully didn’t. (And just a few days ago I even saw someone claim I promised this way back in 2010 of all things, which is two years off any statement about conformance that I’m aware of.)

    So in yesterday’s talk I tried to clarify, in words and in print on the slides, that RTM-ready conformance features would ship primarily in major releases, not Updates, because we do not ship breaking changes in Updates. It would be horribly awful if someone with a working VS 2012 project installed Update 1 and their project broke! This applies especially to library changes, but can also apply to language changes.

    We can and do break binary compatibility on major releases, in this case for the first time just one year apart between VS 2012 and 2013. And I should note that this is already a pretty aggressive schedule for pushing out breaking changes, and we worry about the risk of destabilizing enterprise customers who were used to planning two-year upgrades — a lot of shops can’t handle too-frequent feature updates. That said, this week I deliberately worded it that we reserve the right to ship “some” features in Updates, but if you happened to see a new language feature come in an Update this should be viewed as a pleasant surprise, and we would ship a feature in Update N+1 only if we could prove with very high confidence that it wasn’t going to break any projects using Update N.

    @Jonathan: Thanks, and sorry if the camera work was distracting — I haven’t seen it yet myself since the video is not yet available. I expect the slides will be made available when the talk video gets posted in a couple(?) of days. [UPDATE a few hours later: Just noticed the video is now live: http://channel9.msdn.com/Events/Build/2013/2-306 .]

    @bames53: There’s nothing published that I know of about our compiler’s internals, AST or otherwise. It’s not secret (or I wouldn’t have mentioned it), it’s just not something I’m aware of anyone writing about. Most people don’t care how the compiler works or how hard the compiler writers labor; they mostly just care about whether it supports features, and it’s the compiler team’s job to make it so without complaining about how hard it is. :) We have a good team and they don’t complain; they just say, “okay, now we need to modernize this part” and go do the design and engineering work to make it so.

    And re your breaking changes question: Right, when we do two-phase lookup it could break code that relies on the current behavior. Even though such code is nonstandard and nonportable, we would treat such a breaking change seriously and migrate people through it. One usual path (but not the only possible one) is to first ship the feature under a switch that’s off by default in one release, but still warn by default on code whose meaning could change that a possible future behavior change is coming; then in the next major release ship the new-behavior switch on by default; then eventually remove the old behavior and the switch. That’s one possible example, but one way or another you want to enable customers to migrate to the standard behavior without forcing a bad “fix everything now” surprise on them all in one shot.

    @Felix: Yes, VS 2013 Preview has a go-live license, so it can be used in production code. See above for Updates — we are setting the expectation that new features will appear in CTPs or in (now-faster) major RTM releases, generally not in Updates.

    @Zura: Yes, we’re hiring. If you’re a compiler developer, please do email me your resume.

    @pjmlp: I don’t think we did promise new features in Updates, but clearly somehow that expectation got out there so I’m trying to be very clear about it. I do not know the update pricing for VS — I believe that will be announced in the near future though. Note that if you’re using the free Express (and we will continue to give away VS 2013 Express as a free SKU), or have a subscription (which includes free releases during your subscription), my understanding is that you have access to new compilers for free anyway. But check with whatever is actually announced, don’t go on what I say since I don’t decide pricing.

    @jalf: See above about out-of-band releases. We did that — shipped a CTP, and then not just VC++ but all of VS moved to a cycle faster than the usual 2-3 year band VS always followed before. Oh, here was where I saw the comment about 2010 — the first talk I gave on VC++ conformance to C++11 was GoingNative 2012, summarized above. In that talk, the key thing we tried hard to communicate with “out-of-band” was to make sure people knew that “not in this release” did *not* implicitly mean “wait another 2-3 years before seeing any more progress” — that was the long-standing and then-current VS release cadence, and we were not going to wait to release in that band but deliver things sooner in CTP and RTM form. Gladly, as it turns out, all of VS is now out of that old band.

    Re pricing: Again, I don’t control or announce that, but what I can do is let you know about the CTP stream and the RTM stream, when to expect features in each stream (and by default not in Updates), and then you can decide what makes sense to purchase armed with that information.

    @Szymon: The VS 2013 release actually took the place of where we were thinking of shipping an out-of-the-2-3-year-band RTM release — all of VS essentially went out of the previous band, so we snapped to that.

    @Concerned: See above — we’re trying to be totally transparent so that you know what to expect and can decide what version to purchase or not purchase. Note that with the Express (always free) or subscription you always can get the latest release without additional cost, as far as I know (and that’s really all I know). We’re working on features, and as they meet the CTP bar and the RTM bar I outlined in the talk, they will be ready to get in the next CTP or the next RTM. But, as I said, RTM by default means major release, which is on a faster than 2-3-year cycle and this time was one year (but I’m deliberately not saying what the next cycle after VS 2013 will be because we don’t know, all we can say is that we’re working on faster cycles than 2-3 years and here’s what it was this time).

    @summerlight: We’re working on the AST in parallel with C++11/14 features, and maybe after we’re done with that we can think about exposing it via an API, but right now our focus is on having it internally so we can implement the Standard fully. After that we can see about doing more.

  15. Can’t wait to get non-static initializers and more.

    string chunk; while ((chunk = read_chunk()).size()) { /*...*/ }

    Wouldn’t while (auto chunk = read_chunk()) { /*...*/ } be nicer?
    This would require string to have an operator bool (returning !empty()).

  16. @pjmlp and all: Thanks for listening. As VS shifts from a 2-3 year cadence to a new faster cadence, all of us are learning it together — both internally as we figure out what’s possible and how to deliver VS faster, and externally as customers start to understand the new pattern as it emerges and learn what to expect. We’ve almost cranked one cycle on the new pattern, and it isn’t a pattern till you’ve done it twice, so this “during transition” period is the most difficult (and worry-creating) time because people have lost the previous familiar structure of the historical pattern of what to expect, without yet having a new pattern set to take its place.

    We (VC++) are excited that this means we don’t have to do our own separate out-of-band RTM releases after all to deliver more often than every 2-3 years, since all of VS has essentially moved to that kind of cadence. But it’s also understandably a period of uncertainty for everyone as we all build and figure out what the new pattern is settling down to be.

  17. I joyfully retract my previous complaint. There *is* C99 to speak of in MSVC! Praise Kernighan and Ritchie, I will be able to use some C99 features in universal code! Thanks Herb.

    Perhaps the sound of happy developers will show MS that it’s OK to support *all* of C99 by 2015. We won’t stop using C++, we promise, we just want cross-platform lowest-level code to be less painful.

    PS: C++1y is looking Very Nice.

  18. Whatever was said about VS2012 and OOB releases, it would appear that a lot of people expected VS2012 to be given semi-regular updates to make it more C++11 compliant. The CTP is a red herring — it doesn’t have a release licence so it’s just a novelty. I know this is what happened at our company. We write an old-school MFC app and there really isn’t anything in Visual Studio for us except C++ compliance improvements (when was the last time MFC, and native desktop developers in general, got any love?). I seriously doubt we’ll be purchasing VS2013 after getting our fingers burnt, which is a shame since I think there’s a lot of funky stuff on its way. Our only hope was that Microsoft would take all of this on board and make VS2013 a free upgrade (think Windows 8.1 after the relative disappointment of Windows 8). That’s not going to happen though.

  19. It is very disappointing that Visual C++ users aren’t going to get automatically generated move constructors and move assignment operators (“rvalue references v3”) in an RTM version of VS until likely near the end of 2014 (assuming the current VS release schedule is maintained).

    Having to write (and then maintain) move ctors/assignment operators (where you simply want what should be the default versions) is, in my experience, time-consuming and error-prone.

  20. For me personally, the difference between “Out of band update for VS 2012 in 2013” and “VS2013 with the same new features” all comes down to pricing. I’m an ex-student and there’s no way I can afford to buy VS2013. The *main* thing I wanted to get from the out-of-band updates wasn’t that they would be delivered sooner (although that was great), but that they would be free for my licence for VS2012. If VS2013 is very cheap to upgrade, then I probably won’t care- see the Windows 8 pricing. If you’re going to want me to pay five thousand dollars, I’m going to be miffed, because that is going to be unaffordable for me, whereas I had previously thought these updates affordable.

    So the short of it is: if you offer VS2013 at a pretty cheap price for those of us who already own VS2012, then I think this won’t be a big PR problem. Otherwise, it will feel to me and many others like we’re not going to get what we thought we would.

  21. I think the misunderstanding/confusion is that you’re thinking about this as a MSVC team member, and not as a MSVC user/customer.

    A year ago, your team promised C++11 conformance features at a faster cadence than the existing 2-3 year cycle.
    These updates would be delivered out-of-band, that is, without waiting for a major version of Visual Studio.

    Or, from your own quote above,

    > in Feb 2012 at GoingNative, we said we’d ship conformance features in “out-of-band” CTP and RTM updates

    I think we all agree so far.

    Now, you can draw two obvious conclusions from this (and I’d say both follow logically from the above):

    1. C++11 conformance features will be delivered at a faster pace following VC++2012 RTM.
    2. These updates will be added to the product your customers already bought – that is, if you buy VS2012 at launch, you will get the RTM version, and over time, it will gain new features (and bugfixes). (I don’t think you can reasonably interpret “out-of-band” to mean “in the next major version”)

    *So far*, this is not just your customers misunderstanding you, it is a straightforward logical deduction based on the above promise. C++11 conformance features will come, and they will come in the form of out-of-band updates.

    Then, as you said in your earlier comment, Visual Studio *as a whole* moved to this faster cadence, which means that conclusion #1 would still be true going forward, but #2 would not.

    Now, your team obviously sees #1 as the main point. You want to deliver C++11 conformance, and as far as you’re concerned, it doesn’t matter which form it is shipped in: part of VS, separately, for free, at a cost, subscription-based, distributed in a lottery, whatever. As long as you get to deliver a more up-to-date compiler, the VC++ team is happy.

    And to your team, #2 really doesn’t matter. It’s an implementation detail. Who cares if variadic templates are shipped in VS2012 or VS2013? What matters is the *date* at which it ships: “how fast can we churn out new features?”

    And if you can deliver #1 without #2, then that’s just as good as delivering both. You don’t see #2 as having any value in itself. From your point of view, it doesn’t really matter *how* C++11 features are delivered, and at the time “out-of-band” was merely mentioned as the most feasible approach.

    Or as you said above, “We (VC++) are excited that this means we don’t have to do our own separate out-of-band RTM releases after all”.

    To you, it’s a *good* thing that you were able to achieve the important part of the promise without having to do out-of-band releases.

    But strictly speaking, “out-of-band” was part of the promise given to your customers.
    Your quote above was not “we said we’d ship conformance features”, it was “we said we’d ship conformance features in ‘out-of-band’ CTP and RTM updates”.

    Now try putting on your customer glasses:

    Your customers certainly appreciate #1, because yes, C++11 conformance matters.

    But your customers also saw a fairly direct promise that there will be “out-of-band updates”, that “if you switch to VS2012, then you will eventually have access to more C++11 features than were in RTM”.

    Now, we can obviously disagree on the importance of the “out-of-band” part of the promise, but it is pretty hard to avoid the fact that it was *part* of the original promise.

    It wasn’t simply your customers misunderstanding you. Your customers latched on to exactly what you *did* say, even the parts that you thought were insignificant and just “an implementation detail in order to be able to deliver on the *important* part”

  22. @Olaf van der Spek: At my company, products will standardize on a compiler per platform, meaning for instance VS2010 for Windows. Updates are fine (thanks to the commitment from the MSVC team not to break anything in updates), but upgrades are not. We are on MSDN subscriptions, so cost is not really an issue for us at least (can’t speak for jalf of course).

    Hopefully we will be able to update the policy per product more often now that VS will update faster, but we don’t know that yet. We may soon standardize on VS2012 (currently VS2010, before that VS2005); VS2013 won’t happen until maybe next year or the year after, or it may be skipped entirely in favor of later releases.

    With that said, I think that Herb and the MSVC team get too much heat from the community – of course I would want all of C++11 to be available now, but we need to be realistic about what can be achieved. Doing updates for both VS2012 and VS2013 increases the burden on the team and would therefore just slow things down.

    I appreciate the efforts from the compiler team and the commitment to more transparency. The roadmap was great to see and will help us make engineering decisions on what we can expect to use for the coming years.

  23. Hi Herb: A bit off-topic, but given this is on the future of C++ I thought I’d ask. What do you think of some of the newer native languages like Rust? Rust, though very immature at this point, seems like it is doing a lot of things right, with its focus on compile-time correctness and pretty syntax. How do you see C++ competing with these newer languages?

  24. =D This talk was some of the best news I’ve heard in 2013! Not sure why so many comments are thinly-veiled insults. Herb, you and the ISO C++ team and the VC++ team are doing a ton of very meticulous work and I just wanted to let you know it’s greatly appreciated!

Comments are closed.