
Archive for the ‘Opinion & Editorial’ Category


What a sad, horrible month. First Steve Jobs, then Dennis Ritchie, and now John McCarthy. We are losing many of the greats all at once.

If you haven’t heard of John McCarthy, you’re probably learning about his many important contributions now. Some examples:

  • He’s the inventor of Lisp, the second-oldest high-level programming language, younger than Fortran by just one year. Lisp is one of the most influential programming languages in history. Granted, however, most programmers don’t use Lisp-based languages directly, so its great influence has been mostly indirect.
  • He coined the term “artificial intelligence.” Granted, however, AI has got a bad rap from being oversold by enthusiasts like Minsky; for the past 20 years or so it’s been safer to talk in euphemisms like “expert systems.” So here too McCarthy’s great influence has been less direct.
  • He developed the idea of time-sharing, the first step toward multitasking. Okay, now we’re talking about a contribution that’s pretty directly influential to our modern systems and lives.

But perhaps McCarthy’s most important single contribution to modern computer science is still something else, yet another major technology you won’t hear nearly enough about as being his invention:

Automatic garbage collection. Which he invented circa 1959.

No, really, that’s not a typo: 1959. For context, the space age was barely a year old; Sputnik 1 had come down at the end of its three-month orbit in January 1958. And 1959’s first quarter alone saw Fidel Castro take Cuba, Walt Disney release Sleeping Beauty, the Day the Music Died, the first Barbie doll, and President Eisenhower sign a bill enabling Hawaii to become a state.

GC is ancient. Electronic computers with core memory were still something of a novelty (semiconductor RAM didn’t show up until a decade or so later), machine memory was measured in scant kilobytes, and McCarthy was already managing those tiny memories with automatic garbage collection.

I’ve encountered people who think GC was invented by Java in 1995. It was actually invented more than half a century ago, when our industry barely even existed.
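
For readers who only know GC as a feature of managed runtimes, here is a rough sketch in C of the mark-and-sweep idea behind early Lisp collectors: trace everything reachable from the roots, then reclaim whatever wasn’t reached. The pool size, names, and root handling below are illustrative assumptions, not McCarthy’s original code.

/* A toy mark-and-sweep collector over a fixed pool of Lisp-style cons
   cells, in the spirit of McCarthy's scheme. Names, pool size, and root
   handling are illustrative assumptions, not his original code. */
#include <stdio.h>
#include <stddef.h>

#define POOL_SIZE 64                 /* a deliberately tiny, 1959-sized heap */
#define MAX_ROOTS 8

typedef struct Cell {
    struct Cell *car, *cdr;          /* NULL stands in for an atom or NIL    */
    int marked;
    int in_use;
} Cell;

static Cell   pool[POOL_SIZE];
static Cell  *roots[MAX_ROOTS];      /* cells the program can still reach    */
static size_t nroots;

static void mark(Cell *c) {          /* phase 1: mark everything reachable   */
    if (c == NULL || c->marked) return;
    c->marked = 1;
    mark(c->car);
    mark(c->cdr);
}

static size_t sweep(void) {          /* phase 2: reclaim unmarked cells      */
    size_t freed = 0;
    for (size_t i = 0; i < POOL_SIZE; i++) {
        if (pool[i].in_use && !pool[i].marked) { pool[i].in_use = 0; freed++; }
        pool[i].marked = 0;          /* reset mark bits for the next cycle   */
    }
    return freed;
}

static void collect(void) {
    for (size_t i = 0; i < nroots; i++) mark(roots[i]);
    printf("collected %zu cells\n", sweep());
}

static Cell *try_alloc(Cell *car, Cell *cdr) {
    for (size_t i = 0; i < POOL_SIZE; i++)
        if (!pool[i].in_use) {
            pool[i] = (Cell){ car, cdr, 0, 1 };
            return &pool[i];
        }
    return NULL;
}

/* Allocate a cons cell; when the pool is exhausted, collect and retry.
   (A real collector must also treat in-flight arguments as roots; this
   sketch skips that because the demo never fills the pool.) */
static Cell *cons(Cell *car, Cell *cdr) {
    Cell *c = try_alloc(car, cdr);
    if (c == NULL) {
        collect();
        c = try_alloc(car, cdr);
    }
    return c;
}

int main(void) {
    Cell *list = cons(NULL, cons(NULL, NULL));   /* a reachable two-cell list */
    roots[nroots++] = list;
    cons(NULL, cons(NULL, NULL));                /* two cells of garbage      */
    collect();                                   /* prints "collected 2 cells" */
    return 0;
}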

Thanks, John.

And here’s hoping we can take a break for a while from writing these memorials to our giants.

Read Full Post »

Ritchie, Stroustrup, and Gosling

Dennis Ritchie gave very few interviews, but I was lucky enough to get one of them.

Back in 2000, when I was editor of C++ Report, I interviewed the creators of C, C++, and Java all together:

The C Family of Languages: Interview with Dennis Ritchie, Bjarne Stroustrup, and James Gosling

This article appeared in Java Report, 5(7), July 2000, and C++ Report, 12(7), July/August 2000.

Their extensive comments — on everything from language history and design (of course) and industry context and crystal-ball prognostication, to personal preferences and war stories and the first code they ever wrote — are well worth re-reading and remarkably current now, some 11 years on.

As far as I know, it’s the only time these three have spoken together. It’s also the only time a feature article ran simultaneously in both C++ Report and Java Report.

Grab a cup of coffee, fire up your tablet, and enjoy.

Read Full Post »

dmr (Dennis Ritchie)

What a sad week.

Rob Pike reports that Dennis Ritchie has also passed away. Ritchie was one of the pioneers of computer science, and a well-deserved Turing Award winner for his many contributions, notably the creation of C — by far the most influential programming language in history, and still going strong today.

Aside: Speaking of “still going strong,” this is a landmark week for the ISO Standard C Programming Language as well. Just a couple of days ago, the new C standard passed what turned out to be its final ballot,[*] and so we now have the new ISO C11 standard. C11 includes a number of new features that parallel those in C++11, notably a memory model and a threads/mutexes/atomics concurrency library that is tightly aligned with C++11. The new C standard should be published by ISO in the coming weeks.

[*] ISO rules are that if you pass the penultimate ballot with unanimous international support, you get to skip the formality of the final ballot and proceed directly to publication.
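
For the curious, here is a minimal sketch of what those new C11 facilities look like in code, using <threads.h> and <stdatomic.h>. Note that the threads library is an optional C11 feature (an implementation may define __STDC_NO_THREADS__ and omit it), so treat this as illustrative rather than guaranteed-portable; the thread and iteration counts are arbitrary.

/* A minimal sketch of C11's new concurrency facilities: <threads.h> for
   threads and mutexes, <stdatomic.h> for atomics under the new memory model.
   <threads.h> is optional in C11 (check __STDC_NO_THREADS__). */
#include <stdio.h>
#include <threads.h>
#include <stdatomic.h>

#define N_THREADS    4
#define N_INCREMENTS 100000

static atomic_int atomic_counter;        /* updated lock-free via atomics    */
static int        plain_counter;         /* updated under a mutex            */
static mtx_t      lock;

static int worker(void *arg) {
    (void)arg;
    for (int i = 0; i < N_INCREMENTS; i++) {
        atomic_fetch_add(&atomic_counter, 1);   /* atomic read-modify-write  */

        mtx_lock(&lock);                        /* classic mutual exclusion  */
        plain_counter++;
        mtx_unlock(&lock);
    }
    return 0;
}

int main(void) {
    thrd_t threads[N_THREADS];

    if (mtx_init(&lock, mtx_plain) != thrd_success) return 1;

    for (int i = 0; i < N_THREADS; i++)
        if (thrd_create(&threads[i], worker, NULL) != thrd_success) return 1;

    for (int i = 0; i < N_THREADS; i++)
        thrd_join(threads[i], NULL);

    /* Both counters should read N_THREADS * N_INCREMENTS. */
    printf("atomic: %d, mutex-protected: %d\n",
           atomic_load(&atomic_counter), plain_counter);

    mtx_destroy(&lock);
    return 0;
}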

Bjarne Stroustrup made an eloquent point about the importance of Ritchie’s contributions to our field: “They said it couldn’t be done, and he did it.”

Here’s what Bjarne meant:

Before C, there was far more hardware diversity than we see in the industry today. Computers not only proudly sported deliciously different and offbeat instruction sets, but also varied wildly in almost everything else, right down to things as fundamental as character bit widths (8 bits per byte doesn’t suit you? how about 9? or 7? or how about sometimes 6 and sometimes 12?) and memory addressing (don’t like 16-bit pointers? how about 18-bit pointers, and oh by the way those aren’t pointers to bytes, they’re pointers to words?).

There was no such thing as a general-purpose program that was both portable across a variety of hardware and also efficient enough to compete with custom code written for just that hardware. Fortran did okay for array-oriented number-crunching code, but nobody could do it for general-purpose code such as what you’d use to build just about anything down to, oh, say, an operating system.

So this young upstart whippersnapper comes along and decides to try to specify a language that will let people write programs that are: (a) high-level, with structures and functions; (b) portable to just about any kind of hardware; and (c) efficient on that hardware so that they’re competitive with handcrafted nonportable custom assembler code on that hardware. A high-level, portable, efficient systems programming language.

How silly. Everyone knew it couldn’t be done.

C is a poster child for why it’s essential to keep those people who know a thing can’t be done from bothering the people who are doing it. (And keep them out of the way while the same inventors, being anything but lazy and always in search of new problems to conquer, go on to use the world’s first portable and efficient programming language to build the world’s first portable operating system, not knowing that was impossible too.)
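
A small footnote to the “portable to just about any kind of hardware” point: instead of baking in assumptions such as 8-bit bytes or byte-addressed 16-bit pointers, portable C code asks the implementation via <limits.h> and sizeof. A trivial illustration follows (any hosted C compiler should accept it; the values printed will naturally vary by platform, which is exactly the point):

/* Portable C doesn't assume 8-bit bytes, a particular int size, or
   byte-sized pointers; it asks the implementation. */
#include <stdio.h>
#include <limits.h>

int main(void) {
    printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);
    printf("bytes per int:            %zu\n", sizeof(int));
    printf("bytes per pointer:        %zu\n", sizeof(void *));
    printf("largest int (INT_MAX):    %d\n", INT_MAX);
    return 0;
}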

Thanks, Dennis.

Read Full Post »

Steve Jobs

Today our industry is much less than it was yesterday. We have lost one of the great innovators. Even more importantly, Steve Jobs’ family has lost a husband and brother and father, and our thoughts are with them.

What can be said that hasn’t been said? Steve has been arguably the single most influential driver and shaper of personal computing in every one of its five decades, from the 1970s to the 2010s. It’s obviously true for the 1970s (Apple, Apple ][) and 1980s (Mac). As for the 1990s, it should be enough that the Mac shaped essentially all of that decade’s desktop and notebook platforms, and icing on the cake that technologies pioneered at NeXT and Pixar so heavily influenced personal gaming and other personal computing. In the 2000s, suffice it to say that Steve put the personal “i” into modern computing and again transformed this industry, and other industries. Looking forward, absent some other world-changing event, it’s clear that the rest of the 2010s will see personal computing develop along the trail he and his teams have already blazed in this decade.

Here is a measure of a man’s impact: Imagine how different — how diminished — the world would be today if Steve had passed away ten years ago.

Makes our hearts sink a little, doesn’t it?

Now imagine how different — how much more — the world would be if Steve had lived another ten years.

Or another twenty. Or another fifty, as though what we have seen were but the first half of his life — and if the second half were spent not as a slowly aging, diminishing man, but with his health and strength and faculties as strong as ever for that much more time, a true fifty more years.

We are all cut down too soon.

Thanks, Steve.

Read Full Post »

Speaking as a neutral observer with exactly zero opinion on any political question, and not even much of a cyberpunk reader, given that I’ve read about two such novels in my life: Is it just me, or do the last few months’ global news headlines read like they were ghostwritten by Neal Stephenson?

I wonder if we may look back on 2010 as the year it became widely understood that we now live in a cyberpunk world. Many of 2010’s top stories read like sci-fi:

  • Stuxnet: Sovereign nations (apparently) carry out successful attacks on each other with surgically crafted malware — viruses and worms that target specific nuclear facilities, possibly causing more damage and delay to their targets’ weapons programs than might have been achieved with a conventional military strike.
  • Wikileaks: Stateless ‘Net organizations operating outside national laws fight information battles with major governments, up to and including the strongest industrial and military superpowers. The governments react by applying political pressure to powerful multinational corporations to try to force the stateless organizations off the ‘Net and cut off their support and funding, but these efforts succeed only temporarily as the target keeps moving and reappearing.
  • Anonymous: Small vigilante groups of private cybergunners retaliate by (or just latch onto a handy excuse to go) carrying out global attacks on the websites of multinational corporations, inflicting enough damage on Visa and Mastercard to temporarily take them off the ‘Net, while being repelled by cyberfortresses like Amazon and Paypal that have stronger digital defenses. But before we get too confident about Amazon’s strength, remember that this definitely ain’t the biggest attack they’ll ever see, just a 21st-century-cyberwar hors d’oeuvre: Who were these global attackers? About 100 people, many of them teenagers.
  • Assange: Charismatic cyberpersonalities operating principally on the ‘Net live as permanent residents of no nation, and roam the world (until arrested) wherever they can jack in, amid calls for their arrest and/or assassination.
  • Kinect: Your benevolent (you hope) living room game console can see you. Insert the obligatory “Minority Report UIs are no longer sci-fi” line here, with optional reference to Nineteen Eighty-Four.
  • Other: Never mind that organized crime has for years been well known to be behind much of the phishing, spam, card skimming, and other electronic and ’Net crime. None of that is new to 2010, but this year saw a significant uptick in the continued transition from boutique crime to serious organization, including spear-phishing aimed at specific high-profile targets such as the U.S. military.

Over the coming months and years, it will be interesting to see how multinational corporations and sovereign governments react to what some of them no doubt view as a new stateless — transnational? extranational? supernational? — and therefore global threat to their normal way of doing business.

Read Full Post »

The Inquirer isn’t normally this silly, and it isn’t even April 1. Nick Farrell writes:

Why Apple might regret the Ipad [sic]

THE IPAD HAS DOOMED Apple, according to market anlaysts [sic] that are expecting the tablet to spell trouble for its maker. … Rather than killing off the netbook, the Ipad [sic] is harming sales of the Ipod [sic] and Macbooks… if the analysts are right the Ipad [sic] has killed the Ipod [sic] Touch.

This is just silly, for four reasons. Three are obvious:

  • The iPod Touch fits in your pocket and can easily be with you all the time. Nothing bigger can ever kill it; at most it can replace it for a subset of users who don’t need in-pocket portability. (Besides, even if all iPod Touch buyers bought an iPad instead, the latter is more expensive and so the correct term would be not “kill” but “upsell”.)
  • The laptop has a real keyboard and full applications. Nothing less full-featured can ever kill it; at most it can replace it for a subset of users who don’t need the richer experience and applications.
  • Even if the iPad were killing those other businesses outright, which it isn’t, it’s always better to eat your own lunch than wait for a competitor to do it.

And the fourth reason it’s silly? Let’s be very clear: The iPad has sold 1 million units in its first 28 days. At $500-700 a pop, that’s roughly $500-700 million of revenue in the first month alone, which means the iPad is becoming a new billion-dollar business in its first two months.

Nick, I don’t think “regret” is the word you’re looking for.

Read Full Post »

These are the two best links I’ve read in the wake of the Flash and HTML5 brouhaha(s). They discuss other informative points too, but their biggest value lies in discussing three things, to which I’ll offer the answers that make the most sense to me:

  • What is the web, really? “The web” is the cross-linked content, regardless of what in-browser/PC-based/phone-based generator/viewer/app is used to produce it and/or consume it.
  • Does web == in-browser? No. Native apps can be web apps just as much as in-browser ones, and increasingly many native apps are web apps. Conversely, not everything that runs in a browser is part of the web, even though most of them are, for the obvious historical reasons.
  • Is it necessary/desirable/possible to make in-browser apps be like native apps? No, maybe, and maybe. The jury is still out, but at the moment developers are still trying while some pundits keep decrying.

Here are the two articles.

Understand the Web (Ben Ward)

This rambly piece needs serious editing, but is nevertheless very informative. Much of the debate about Flash and/or HTML5 conflates two things: the web, and application development platforms. They aren’t the same thing, and in fact are mostly orthogonal. From the article:

Think about that word; ‘web’. Think about why it was so named. It’s nothing to do with rich applications. Everything about web architecture; HTTP, HTML, CSS, is designed to serve and render content, but most importantly the web is formed where all of that content is linked together. That is what makes it amazing, and that is what defines it.

… [in the confused Flash and HTML5 debates] We’re talking about two very different things: The web of information and content, and a desire for a free, cross-platform Cocoa or .NET quality application framework that runs in the browsers people already use.

On a different note, speaking of the desire for super-rich in-browser apps, he adds:

Personally, aside from all of this almost ideological disagreement over what the web is for, and what you can reasonably expect it to be good at, I honestly think that ‘Desktop-class Web Applications’ are a fools folly. Java, Flash, AIR and QT demonstrate right now that cross-platform applications are always inferior to the functionality and operation of the native framework on a host platform. Steve Jobs is right in his comments that third-party frameworks are an obstacle to native functionality.

HTML5 and the Web (Tim Bray)

Again, the topic is what “the web” is, and how it has nothing specifically to do with HTML. From the article:

The Web is a tripod, depending critically on three architectural principles:

  • Pieces of the Web, which we call Resources, are identified by short strings of characters called “URIs”.

  • Work is accomplished by exchanging messages, which comprise metadata and representations of Resources.

  • The representations are expressed in a number of well-defined data formats; you can count on the message data to tell you which one is in use. It is essential that some of the representation formats be capable of containing URIs. The “Web” in WWW is that composed by the universe of Resources linked by the URIs in their representations.

That’s all. You notice that there’s nothing there that depends crucially on any flavor of HTML. Speaking only for myself, an increasingly large proportion of my Web experience arrives in the form of feed entries and Twitter posts; not HTML at all, but 100% part of the Web.

On Flash · This may be a side-trip, but anyhow: I entirely loathe Flash but by any definition it’s part of the Web. It works just fine as a resource representation and it can contain URI hyperlinks.

Native Applications · A large proportion of the native applications on iPhone, and on Android, and on Windows, and on Mac, and on Linux, are Web applications. They depend in a fundamental way on being able to recognize and make intelligent use of hyperlinks and traverse the great big wonderful Web.

… So whatever you may think of native applications, please don’t try to pretend that they are (or are not) necessarily good citizens of the Web. Being native (or not) has nothing to do with it.

Good stuff.

Read Full Post »
