[ed. note – this post might seem a bit out of our normal oeuvre at B.J., but since a number of threads have fallen to the code wars I figured that we should offer some topical fare to tech-savvy readers]
Several months back a genial biologist in his late 70’s used the occasion of a visit to my department to deliver a riveting seminar on new ways to bring sophisticated modeling approaches into the hands of code-phobic bench workers like him and me. To be fair, Dr. Oliver Smithies, who invented gel electrophoresis, homologous recombination of transgenes and the genetic knockout mouse, is not your ordinary genial biologist in his late 70’s. But despite going into some detail about a project based on recursively building a numerical model, testing it against experiment and then refining the model (by far my favorite kind of project, BTW), it’s fair to say that the talk didn’t go over the head of anybody in the room, even the tech-phobes like myself.
The key to his work, and the key to my following it, was a nifty software package called STELLA. Figuring that time is a finite quantity, and that most who need numerical models don’t and probably shouldn’t know much about programming, STELLA offers a dead-simple drag-and-drop interface for building sophisticated interaction pathways. The idea is to abstract model building so that anybody who can make a PowerPoint presentation can be a potential modeler. Rather than posing any threat to information biologists, whose sophisticated problems don’t lend themselves to abstraction, democratizing the basic modeling tasks frees the specialists from jobs which are often beneath their skill level.
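For the curious, here is a minimal sketch of the kind of model a STELLA diagram boils down to: one stock, an inflow and an outflow, stepped forward in time, then checked against experiment and refined. The code is Python rather than anything STELLA itself produces, and the names and rate constants are made up purely for illustration.

```python
# A minimal sketch, with made-up names and rate constants: one "stock"
# (a protein concentration) changed by two "flows" (synthesis in, decay out),
# stepped forward in time. This is the sort of loop a STELLA diagram hides.

def simulate(synthesis_rate=1.0, decay_constant=0.1, dt=0.1, steps=1000):
    """Euler-integrate d[P]/dt = synthesis_rate - decay_constant * P."""
    protein = 0.0                              # the stock starts empty
    history = []
    for _ in range(steps):
        inflow = synthesis_rate                # flow into the stock
        outflow = decay_constant * protein     # flow out, proportional to the stock
        protein += (inflow - outflow) * dt     # one Euler step
        history.append(protein)
    return history

if __name__ == "__main__":
    # "Test against experiment": the simulated steady state should approach the
    # analytic value synthesis_rate / decay_constant; if it doesn't match the
    # bench data, refine the parameters and run again.
    trace = simulate()
    print(f"steady state ~ {trace[-1]:.2f} (analytic: {1.0 / 0.1:.2f})")
```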
Moving on, I thought my biology experience might help explain why I think a new idea in the programming world is overdue and why the controversy is largely misplaced. Briefly, Charles Simonyi, inventor of MS Office, longtime project manager at Microsoft and would-be astronaut, wants to do for programming what STELLA did for tech-phobe biologists.
Simonyi is arguably the most successful coder in the world, measured in terms of financial reward and the number of people who use his creations. […] He is obsessed with a project that he has pursued for a decade and a half, and that four years ago carried him right out of Microsoft’s doors. He is proud of his profession. But he is also haunted by the thought of what programmers must contend with each time they sit down to code. He asks, Why is it so hard to create good software?
[…] Simonyi’s ambition is to unstop that software bottleneck–characteristically, by going meta. He’s developed an approach he calls intentional programming (or, more recently, intentional software), which he hopes will overturn programming. If Simonyi has his way, programmers will stop trying to manage their clients’ needs. Instead, for every problem they’re asked to tackle–whether inventory tracking or missile guidance–they will create generic tools that the computer users themselves can modify to guide the software’s future evolution.
[…] Intentional programming would add an entirely new layer of abstraction to the practice of writing software. It would enable programmers to express their intentions without sinking in the mire of so-called implementation details that always threatened to swallow them.
Not being a programmer, I will let more experienced hands explain the difference between Simonyi’s project and existing forms of abstraction like Visual Basic. Nonetheless I take it from the enthusiasm of the staff at MIT Technology Review that he is on to something new. Critics have a few concerns:
Some theoretically minded skeptics say Simonyi’s goal of capturing computer users’ intentions is implausible. “How do you represent intent?” asks computer scientist Jaron Lanier. “As soon as we know how the brain stores information, maybe we can represent intent. To me it just seems like a fantasy.” Another argument, common among programmers, is more practical. Many programmers love their text-based editors and distrust tools that distance them from raw code. As for graphical programming languages like Visual Basic and the integrated development environments (IDEs) that automate routine programming tasks, they regard them with condescension: such tools, they say, impose their own ways of doing things, constrain creativity, and keep programmers from the code that, sooner or later, they must confront.
To be honest, the first strikes me as bluffing by overtechnicalizing a relatively simple idea. Intentional software doesn’t need to mimic the biology of human intentions. In fact it shouldn’t, since our brain operates nothing like a silicon chip. Code that is meant to run on silicon ought to reflect the logical, linearized way that silicon chips operate, and it seems fair to say that we understand that fairly well. If intentional code doesn’t give us tools which exactly mimic the way that we think, well, when has code ever done that?
As for the second point, I think that critics are misunderstanding the likely impact of Simonyi’s work. Thinking again about STELLA, the most likely users will be people who might need code but don’t have the experience or time to pick it up. Even more than I think Simonyi realizes, intentional design will democratize the process of making software without shutting out dedicated coders from software’s gnarly roots. Some problems are too new, complicated or unique to abstract, but many are not. It seems like a waste for a programmer who could be working on the former kind of problem to spend his/her time on the latter.
Grrr
Graphical IDE’s can be great time savers, but I don’t recommend them for folks without at least a passing familiarity with the basic disciplines of coding.
I’ve seen too many middle-managers buy one of these toys and think “Hey, this programming shit ain’t so hard”. Next thing you know they have a departmental RAD team churning out tons of useless bug-laden crap that everyone quietly hates.
Sal
Speaking as a programmer, I don’t see what this is about. As far as I can tell, this is something that would let some non-programmer code stuff without knowing code. A lot of wizards in Office, for example, might meet this criterion.
But you still need programmers to code the programs to do that. Unless they’re talking AI, we’re not there, and maybe don’t want to be.
RSA
Tim F.,
That’s an excellent and thoughtful post, especially for a self-confessed technophobe. It would do credit to a computer scientist (this is praise, in case that’s not clear :-). Here are a few thoughts.
First, the goals of Simonyi’s work are laudable and shared by probably thousands of researchers in computer science and hundreds of thousands of people in industry. That said, Simonyi hardly publishes anything, even about intentional programming, so it’s hard to evaluate his ideas. Will they work? Time will tell. He’s been working on this for over ten years, and there’s really nothing to show for it yet. We can see any number of similarities to existing work (for example, in object-oriented programming, aspect-oriented programming, the kind of macros supported by Common Lisp and a few other languages) but these generally haven’t led to revolutions in software development, though there have certainly been striking success stories. Validation is missing, and so while intentional programming (like literate programming) sounds great, we don’t know whether or how well it will work yet.
Second, I think that Simonyi’s view of intentional programming as a silver bullet for software development is not misguided, but it’s still only part of the picture. One interesting question to ask is: what abstractions are important in a program? There are subtleties here that current accounts of intentional programming seem not to address. A distantly-related anecdote may illustrate: in a talk I heard last year, Pat Hayes, a philosopher who works in artificial intelligence (in particular common sense reasoning), described a water-cooler debate he observed years ago between developers working on CYC, a common sense reasoning engine. Say you have a can of paint sitting on the floor in the middle of a room. The paint is in the can, and the can is in the room, so the paint is clearly in the room. A painter comes in and paints the walls of the room with the paint. Is the paint still in the room? At what point does it become part of the room? The debate moved to whether carpeting is in a room or part of the room. Pat’s conclusion was perhaps a natural one: if you have to argue about the semantics of such difficult-to-decide issues in a common sense reasoning system, you’re probably on the wrong path. Similarly, I’d say that the main problem faced by Simonyi’s approach is formalizing a programmer’s intentions. Providing ever-more-flexible layers of abstraction and mechanisms for mapping to specific computational models is ignoring some very hard questions.
Third, I’ve done considerable work in the area of intelligent user interfaces, a cross-disciplinary field in computer science between artificial intelligence and human-computer interaction. A good number of research efforts in the 1990s, aimed at automatically generating user interfaces based on abstract models that sound not too far off from what Simonyi is talking about, basically petered out: the problems are just too hard to solve. Formalizing domain knowledge in sufficiently precise form to be automatically translated into usable interfaces is not yet feasible; the problems aren’t in development environments but in how we understand foundational concepts, e.g., what users are trying to do when they build and use interactive computer programs. Intentional programming doesn’t seem to address this point.
RSA
Oops, I also meant to comment on this point:
Alan Cooper, a famous guy who has written a mixture of stupid and intelligent things, gives a nice example of the problems faced by systems for end-user programming. Imagine that you have a national company database, and you want to find the names of people at two plants, one in Texas, the other in California. A perfectly idiomatic way to phrase this question, in English, is “Find the employees in Texas and California.” Depending on the way this is translated into Boolean (logical) form, it might turn into this: “Select employees where location = ‘Texas’ and location = ‘California’.” Of course, this will return only people who work in both states, i.e. no one at all. The problem here isn’t only writing a “smarter” translator, but rather getting non-programmers to realize that the system is making a particular kind of mistake. As I tried to describe above, formalization is the difficult problem.
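To make the failure concrete, here is a toy version of Cooper’s example, with hypothetical data and Python standing in for the query language; the literal Boolean translation returns nothing, while the union is what the English sentence actually asks for.

```python
# Toy version of the example, with hypothetical data; Python list comprehensions
# stand in for the query language.
employees = [
    {"name": "Ann",   "location": "Texas"},
    {"name": "Bob",   "location": "California"},
    {"name": "Carol", "location": "New York"},
]

# The literal Boolean reading of "in Texas and California": no single row has
# two locations at once, so this always comes back empty.
literal_and = [e["name"] for e in employees
               if e["location"] == "Texas" and e["location"] == "California"]

# What the English sentence means is a union, i.e. Boolean OR:
intended_or = [e["name"] for e in employees
               if e["location"] == "Texas" or e["location"] == "California"]

print(literal_and)   # []
print(intended_or)   # ['Ann', 'Bob']
```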
Keith
I thought MS abandoned intentional programming as an MS Research project years ago. It was kind of an encapsulation of the algorithm instead of the class (an object-oriented programming term), but I’d figured that work wound up getting rolled into .Net in a much less ambitious format.
demimondian
Like RSA, I wish Simonyi the best, and like him/her, I’m inclined to doubt that the STELLA approach will succeed.
I truly wish we could democratize software, but I doubt we ever will. There’s nothing so exhilarating as hitting return, and having it *work*. (In fact, it can be really amusing — some grey-hair sitting in his or her chair, typing, does something small, and then breaks out with some touchdown dance which would do credit to T.O. Of course, nobody else knows what happened, so they look even stupider than Owens does.)
It’s so exhilarating because there’s so much tedium in getting there, though. Not only do you need to write the code in the first place, you then need to make it work. It’s an old truism that a programmer spends 80% of his or her time debugging. That fact stays pretty much constant, like the old Fred Brooks number that, independent of language, a programmer, on average, can write about 10 lines of fully debugged code in a day.
Unfortunately, we have a name for people who find debugging tolerable. We call them “professional programmers”. Normal people find it repulsive, and I don’t see that changing any time soon.
demimondian
I didn’t know that IP had ever been a topic at MSR. It’s always been Simonyi’s particular interest, though.
ThymeZone
In business software, it’s about data.
Data obeys the laws of data, which are immutable in the same way that the laws of physics are immutable. Removing abstraction layers is only part of the problem.
If you want a good example of what happens when you try to hide reality from computer users, look at the nearest MS Access “application” in your office. It’s probably junk. It’s junk because you can’t make pig lips look like petunias and expect the result to be a program that accurately diagnoses pig diseases for amateurs. That’s a hideous metaphor and I may burn in hell for it, but the point is in there.
You can’t pretend that the abstractions necessary to deal with data can just be skipped over, sugar coated, turned into point-and-click interfaces, or the details subsumed into “wizards,” and get good results.
Good luck to Mr. Simonyi and his elves. I’m sure they can spend years happily chasing fairy tales.
Keith
It reminds me a bit of aspect-oriented programming in terms of the drag-n-drop ideas of adding functionality. AOP is possible to hack in under .Net (although the pathway is one that will probably mean apps using it won’t work a few versions down the line), but where I found it to be especially deficient was in its inability to relate aspects to one another. For instance, I can drop on a “Log method calls” aspect, and it’s largely fire and forget, but if I need multiple aspects to work in tandem, it becomes very vague as to how to describe the interactions declaratively.
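As a rough analogy only (Python decorators, not .NET AOP), the shape of the problem: each aspect is simple and fire-and-forget on its own, but the only place their interaction gets expressed is the order they are stacked in, which is exactly the part that resists a clean declarative description.

```python
import functools

def log_calls(func):
    """The 'log method calls' aspect: fire-and-forget, knows nothing about other aspects."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__} with {args}")
        return func(*args, **kwargs)
    return wrapper

def retry(times):
    """A second aspect. Should it log each retry, or only the outermost call?
    The answer lives only in the stacking order below, not in any declaration."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(times):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:   # broad catch, illustrative only
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorator

@log_calls           # the stacking order is the only statement of how the aspects interact
@retry(times=3)
def fetch_record(record_id):
    return {"id": record_id}

fetch_record(42)     # prints: calling fetch_record with (42,)
```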
OCSteve
It’s not. Right up to the time people try to use it, it is damned perfect.
For the most part, developers are in solitary confinement. They get fed requirements that have been written by committee. GIGO.
For my part, I will never hire a developer who has not spent substantial years (5-10) working tech support and implementations.
IMO, it is critical that developers have years of experience dealing firsthand with those directly impacted by the software. You take a kid out of school and make him/her a developer – this is what you get. I want someone who has had years of face-time with real people.
Users suck. If not for them, writing software would be nirvana :)
ThymeZone
Heh, in another time, we could have a great argument, OCS.
After twenty years of slinging code before I took my present gig, I concluded long ago that most business software is crap … precisely because it is written for and by programmers. Most Internet and e-commerce stuff is crap for the same reason. IMHO, of course.
As for users and “requirements,” most development processes have no concept of proper requirements or their place in proper process. So there you go. We’re all doomed.
Andrew
Well, I don’t see what the problem is. All software will be written by highly motivated Indian and Chinese coders who will work for 1/10th of the cost of a much more lazy American.
Not our problem anymore! Let’s get some beers!
ThymeZone
Andrew nails it again.
It’s Miller time.
demimondian
And that goes double for most open source stuff. There’s a lot of coding that Just Isn’t Fun.
I see you like pizza pie, Herb.
demimondian
Actually, the figure is quickly evening out. There seems to be more need for coders than there are coders.
Hyperion
yes, please explain. i am not a software engineer but i play one at work. ;=)
i’m a chemist but my company needed more coders so i waded into C++ ten years ago. i also volunteered to take on some VB6 and database projects.
i know this is not what a REAL software engineer would say but….i appreciate a language that hides complexity.
carefully designed wizards can be time-savers, like validation controls that VS8 provides to “automate” the tedious input checking necessary for robust apps. but someone still has to make an overall project design and then reduce it to myriad details which then get implemented.
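for a sense of what those controls are generating, here is a sketch (python, field names made up) of the kind of tedious hand-rolled checking they replace:

```python
# a sketch (field names made up) of the hand-rolled checking a validation
# control generates for you
def validate_order_form(form):
    errors = []

    quantity = form.get("quantity", "").strip()
    if not quantity:
        errors.append("quantity is required")
    elif not quantity.isdigit() or int(quantity) == 0:
        errors.append("quantity must be a positive whole number")

    email = form.get("email", "").strip()
    if "@" not in email or "." not in email.split("@")[-1]:
        errors.append("email address looks malformed")

    return errors

print(validate_order_form({"quantity": "-3", "email": "nobody"}))
# ['quantity must be a positive whole number', 'email address looks malformed']
```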
i guess i’m a little cynical about this. and skeptical. human brains are amazing things. computers? not so much. to me this is the lesson of AI.
RSA
I teach my computer science department’s courses in human-computer interaction. At the undergrad level, I’ve had students tell me that they didn’t learn anything that was really new, but it was still worthwhile, for the difference in perspective. (I go light on the theoretical issues.) It turns out that getting computer science students (and professional programmers, for that matter) to put themselves in the position of someone who’s not a computer scientist is extraordinarily difficult. For anyone who’s seen the TV show House, most programmers are Gregory House, deep inside, when they talk with people outside the field. (“What kind of moron would press the Shift key when my application says, ‘Press any key to continue?!'”)
OCSteve
Exactly. Give me a buck for every time I’ve heard, “it worked for me in the Dev environment” and I’m retired tomorrow.
Based on your comment – I assume you survived Y2K. Were you on call that night? I was… Heh. Man did we (I) make money off that scam.
The best bipartisan proposal I have heard in some time. I’m in. Corona for me. I suppose that makes me anti-American. Screw it.
demimondian
Not me, but FDDD was. What a clusterfuck.
And I’m so glad that I will be retired before roll-over day comes in 2027.
ThymeZone
We spent a year doing Y2K remediation. I did almost nothing else during that time. We were tested and ready by mid-1999. I had no worries on Dec 31. Nothing broke in my world.
What happened next was that the job market became flooded with people who had been employed in Y2K initiatives, and hasn’t totally recovered yet. This has put a dent in my insatiable pursuit of greenbacks. Not any serious damage, but it’s frustrating to watch my adjusted-for-inflation income shrink or stagnate lo these several years.
At this rate I don’t know where my next annual new car is coming from. Okay, I admit to being neurotic about new cars (I’d drive a new one every 90 days if money were no object) but the price of groceries ain’t staying put. Or electricity, or natural gas, or fuel for the cars, or anything else.
OCSteve
I don’t figure I can actually ever retire (at least in the commonly accepted definition of the word). But yeah, 20 years from now I surely won’t give a damn.
RSA
Is it always the same model? (I’ve tended to buy a car that combines what I like with what I can afford and driven it into the ground, at which point I repeat.)
OCSteve
Mine neither. The beeper never went off. We spent the last half of 99 in a panic. I held my breath at midnight, then checked a few systems, and went to bed.
I hired and retrained about 20 COBOL programmers in 2000. In ’99 they could get 120k. In 2000, they were on the street. Some day that will be me, I guess.
Andrew
I have a lot of faith that I can out-lazy any ten mediocre Chinese coders. I will preserve the ratio to the best of my ability.
No, it just means you have bad taste in beers. I propose “anything but Corona” as a superior beer choice.
OCSteve
You go into Friday night with the beer you have cold in the refrigerator – not with the beer you wish you had.
OCSteve
Err, Saturday?
Andrew
Did Turkey prevent you from bringing in the 4th India Pale Ale Division via the Northern route?
OCSteve
Yeah. Bastards.
It could be worse. I’m known to drink Coors light on occasion. Yes, I am ashamed.
demimondian
You should be.
Look, it really isn’t hard to make your own beer, and before you’re reduced to drinking that kind of swill, you should be a man, and spend two weeks fermenting the wort into beer, bottling it, and letting it condition.
tBone
Especially if it’s Coors Light. All you need is a real beer, an empty bottle, and a half-hour or so.
bago
Aspect oriented programming is being done under several movements at MS with things like F# and there is much work being done under the name HIP in MSR.
Cool stuff. I’m still aglow from C#, so I can’t wait for the future.
bago
Also, I’ve seen ads for the VIBE team, and I really want to move into a group like that. Better start studying.
Pb
Uh. I stopped reading somewhere around here:
Great. Not content with having gotten millions and millions of people to buy (!) and use his crappy software, now he wants to sell them a tool to make their own knock-off crappy software? Fuck that shit–one MS-Office was bad enough.
All that having been said, of course you can build tools that will make application development easier, especially in a limited domain with specific needs. For example, I got the role-playing game Neverwinter Nights 2 for Christmas, and it comes with a toolset that lets you design your own modules. In no time at all, you can be wandering around a field of your own design, running into trees, talking to cats, all beautifully rendered in three dimensions, etc., etc.
RSA
I wonder if an intentional development environment might have the unintentional effect of producing a lot of write-only software. Based on what I’ve read of Simonyi’s ideas, I also wonder whether the kind of programming that he’s talking about might only be possible by really, really good programmers. We know that there can be a couple of orders of magnitude in effectiveness between good and bad programmers; I find it plausible that some languages might be great in the hands of a great programmer but disastrous when used by most other people.
The Other Steve
Some interesting ideas here. And some random comments.
I think what Simonyi is talking about sounds an awful lot like 4GL. It therefore seems to me that we’re already doing it. Hyperion is more right than he knows. Nearly everybody appreciates a system which hides complexity, as long as the complexity is done in a seamless manner which doesn’t introduce bugs you can’t fix.
Over time, we have automated things to hide complexity. Back when I started developing in the 1980s, if you wanted to support a mouse and put a button on the screen for the user to click on, you were doing a *LOT* of hand coding. Now the mouse is handled by the OS, and the button is one line of code.
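A present-day stand-in for that contrast (Python/Tk purely for illustration): the OS and toolkit handle mouse tracking, hit-testing and redraw, and the application declares the button in a single line.

```python
# Python/Tk stand-in: the toolkit handles mouse tracking, hit-testing, and
# redraw; the application just declares the button and what it should do.
import tkinter as tk

root = tk.Tk()
tk.Button(root, text="Click me", command=lambda: print("clicked")).pack()  # the one line
root.mainloop()
```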
What Simonyi appears to be talking about is integrating in more of the business logic. That is, the idea of point and click reporting, etc.
But generally that’s not the hard part of development. That’s the easy part.
In my business apps, I probably spent 20% of my time coding in the “business logic”, and the remainder of my time trying to figure out how to do other things which are hidden by the technology.
Over time that changes. Like the button on the screen, the technology evolves to automate the complexity.
But the problem is, that as we automate the complexity, this allows/encourages us to do even more.
So it’s an interesting concept, but it’s no silver bullet. Again it sounds like 4GL. And to that, you can read “Rise & Resurrection of the American Programmer” by Yourdon to understand why this will never happen.
demimondian
You know, I hear this all the time. Want to explain to me why MS Office is so bad?
The Other Steve
It’s Microsoft therefore it’s EVIL.
The Other Steve
You know, the other thing I realized. A lot of business software these days is written using rules engines. Such as iLog and other vendors.
In many of these cases, the rules are actually editable with a GUI tool and can be maintained by savvy business users.
RSA
There are lots of possible explanations, as I’m sure you know. One is that Office applications have gotten too big and complex to be easily used. A paper I saw a few years ago tested one aspect of this: the authors built a minimalist front-end to Microsoft Word, one that allowed users to add menu options from the full-up system to their personalized version, incrementally [PDF]. Results generally favored the personalized interface, especially among “feature-shy” users. Office’s one-size-fits-all approach works about as well as it does with clothing.
Andrew
Because people have spent tens of billions of dollars on it and office productivity has not seen any measurable increase in the past decade. I’m willing to bet that the usefulness of any new feature has been more than doubly offset by its bugs, viruses, and retraining time.
WordPerfect 5.1 FOREVER!
demimondian
Actually, office productivity has skyrocketed; that’s one of the things which has kept us alive through the recent years. The productivity increase is hidden, though, by the removal of support personnel who used to type and file and add and tabulate.
fwiffo
Nothing to see here, move along.
I’ve been a professional programmer for a long time, and it was a hobby of mine for a while before that. There’s always somebody threatening to change the whole paradigm of software development. People introduced functional programming (Lisp, etc.), and object oriented programming, and aspect oriented programming, and now people are really excited about functional programming again. Some people gather together techniques like CRC cards, and pair programming and “patterns” and call it “extreme programming”. Now this joker has “intentional programming”. Some of these things are useful tools, but that just means you can tackle bigger and more interesting problems.
Guess what – people have been doing this ever since programming was invented, and programming is still hard! I don’t know what it is about it that makes people think it’s easy or that it ought to be. Every other serious professional job is hard and requires special training. It’s pretty fucking insulting that people think that anybody can program. Imagine if somebody claimed they had invented some magic new tool that made engineering really easy, or made it easy for anyone to be a lawyer without the hassle of law school, or made it simple for anyone to practice medicine. It would be a joke!
Any serious profession requires years of training and experience to get good at, and even then, most people will only be OK, not great.
I’ve been programming most of my life, went to college and got a degree, and spent years in the industry. But I still get people expecting me to teach them everything I know in a month. They buy a book and expect to become an expert in 30 days. I just don’t understand it.
This guy seems to think he’s special for rediscovering the Turing tar pit. Good for you, buddy.
demimondian
Interesting, since the Office UI models exactly that in 2003. The geeks, of course, hate it, because it hides the power features that make them look cool to their co-workers. In 2007, the UI is modified to try to give the feature-shy a chance to see some of the features they almost certainly want, but don’t know how to get.
We’ll see how it works out.
RSA
The authors (I don’t know from personal experience, not being a “power” user of these kinds of applications) claim that the difference is between adaptable (user explicitly adds access to a function) and adaptive (system infers a need and makes access available automatically) interfaces. The difference may be subtle, if this is the case, but apparently it has a significant effect on user acceptance.
This will be interesting to see. Microsoft has lots of good ideas; maybe there’s something new.
RSA
See CAD/CAM, systems like EULAlyzer, and WebMD, respectively. (That is, see some claims made about them, rather than systems themselves.)
Well, yeah.
demimondian
On the other hand, I’ve played with house design programs which did generate legitimate architectural drawings, with correct loadings, cross-sections, and the like. Sitting here in the odd little nook of my home office that I added for my desk, and which I only knew was possible because I had played with such a program, I’ll tell you that (a) I’m still not an architect, and (b) I was able to explain to the architect exactly what I wanted much better because of that program.
RSA
That’s another thing missing from the Simonyi story: Working through a problem in enough detail to program it, with collaborators especially, can really give people a better understanding of the problem and what counts as a good solution.
My brother just got a job in an architecture company in NYC. He thinks Google SketchUp is pretty cool, though I haven’t tried it myself.
Andrew
I’m willing to bet that absolutely none of those productivity increases have come from Microsoft Office. Back end software, sure. Better operating systems, yep. Networking, of course. Mobile platforms, okay. Microsoft Office? Ha!
What, exactly, has any recent version of Office offered over, say, Office 97 or 2000, that has been worth thirty billion dollars?
Windows has also reached this point. There is basically no compelling reason to upgrade from XP for any end user.
Windows and Office are basically unstoppable bureaucratic monsters, much like the defense industrial complex. No one really wants or needs a new version of Office or a nuclear submarine, but there’s money to be made if you can buy the guy making the purchasing decision (your Congresscritter/IT guy).
demimondian
You’d lose that bet, you know.
Most people can get away with repeating the old canard because they define “Office” to be “Word” or “Excel”, or, perhaps, rarely, both. I challenge anyone to shrug off the effect that Outlook in 2003 or 2007 had on productivity. You may not think that simple things like making scheduling a meeting easy matter, but, believe me, that I don’t need to call your admin, have her (or, rarely, him) find you, then get back in touch with my admin, and have them negotiate on our behalf…that’s huge.
As to never upgrading XP, you’re wrong there, too. The Aero interface moves a lot of the computation involved in rendering the UI off the CPU and onto the GPU. Not only does that make useful eyecandy easier (have you played with the ALT-TAB menu under Vista?), but it also offloads a lot of computation onto a high-bandwidth device which is designed to handle it.
(Ironically, I never worked on either Windows or Office. I worked on all the other pieces you list while I was at MS, but never on either of the Big Two. Funny how that happens, eh?)
Andrew
OMG automatic meeting scheduling! Something Meetingmaker and dozens of other products have been doing for 20 years.
Woo.
Surely worth 30 billion.
And fast alt-tabbing!! OMG111!!!one! That is totally worth spending $900 to upgrade my computer to be Vista handi-capable. I can now switch to another app in 0.00002 second instead of an abysmally slow 0.0004 seconds!
Geez. How much is microsoft paying you? Seriously. Faster alt-tabbing? I should buy a $200 video card, an extra gig of ram, a new dual core CPU, and a $200 piece of software for faster alt-tabbing?
demimondian
Since I don’t work for Microsoft any more, your first accusation is kind of foolish isn’t it? And, more, when I worked there, I didn’t work on the big two, like I said. So, you’re wrong on two points.
Thus, you’ll need to forgive me if I point out that this
is just stupid. I can break the best of the “others” in five minutes or less. Hell, I can break Notes, which is orders of magnitude better than the shit that tends to get promulgated as ‘calendaring software’. Meetingmaker, for instance, does not have an integrated scheduler, nor does it provide support for multi-resource scheduling. Guess what, dev? People out there in the *real world* care about that.
And it isn’t fast alt-tabbing; in fact, it’s slower than XP’s. It is, however, far more usable, integrating things like the mouse wheel (by the way, a Microsoft introduction).
Andrew
Maybe you’re typing this on a brand new Ferrari Acer laptop? I kid, I kid.
So, what you’re saying is that really good scheduling software has been worth 30 billion in direct software costs and probably a few score billion more in secondary support and upgrade cycles, versus supposedly mediocre options?
In other news, Microsoft declares victory over rendering HTML in a competent manner.
Andrew
In other news, most users do one thing at a time. If they do more than one thing, they can remember that their e-mail application is also running and they click the little task bar icon when they want it.
Fancy application switching is worth 10 to 20 cents, tops. Where’s the other value?
demimondian
Nope. My current employer would have required me to send it back as a conflict of interest.
I’m really reluctant to engage on the “$900 Vista machine canard”, but better to be shorn as a sheep as a lamb…
First, most people won’t upgrade from XP directly, but will, instead, change computers, at which time they’ll get a new license, which will typically be for the most recent version of Windows. Thus your $200 license cost for Vista is not a real cost; computers wear out, and need to be replaced, and, yes, you buy a new license for a new machine. (If, of course, you actually bought a retail license for XP, *and* you throw out the old machine, then your first upgrade makes it worth the price, since you can move your OS over to the new machine.) Thus, you’ve just said that Vista is worth it for them.
Now, let’s look at the three-year-old box I’m typing on right now, which was bought with 512MB of RAM. I could run Vista, with all the sparkles and twinkles, on this box, and it would perform perfectly well. Go chase the various perf tests that have been done on the web; the Slashbots have lied to you about needing the extra hardware. There goes the other seven hundred. So, the question is “is it worth $200 to run Vista on this computer?” and the answer is “not to me, but I will enjoy it whenever I do spring for a new box”. (Which, in my case, is likely to be in about two years.)
demimondian
And, what I’m saying is that the upgrades would have happened anyway, and that, yes, really good scheduling software is absolutely worth that.
B-b-b-but, you say, why do big companies upgrade? Well, it has to do with their cost structure. They’re going to need to support whatever version or versions of Office they run. They can choose to run a heterogeneous environment if they want to. They aren’t even “forced” to upgrade from out-of-support versions; like IBM, Microsoft Consulting can and does enter into agreements to support old software on a case-by-case, pay-for-time basis.
All that said, it is very expensive to support older versions of any product, and it is even more expensive to support more than one version of a product. Thus, most companies choose to move from one version to another all at once, not for the features, but for the support costs; it’s cheaper to pay the volume licensing cost to move than it is to stay on the current version.
And this all explains why it’s cheaper to buy Office than Meetingmaker. Meetingmaker has a different management metaphor than WordPerfect, which has a different metaphor than Linux, which has a different… Those divergent metaphors raise your support costs just like a heterogeneous Windows environment. Yes, you can buy OpenView from HP or Tivoli from IBM to *help* clean that experience up, but…why do that when you can buy a unified management experience from Microsoft? Again, overall, it is cheaper.
Andrew
So, in summary, you’re happy paying a Microsoft fee of about $100-200 every year, amortized? Even though it doesn’t add any functionality?
Here’s the kicker: Would it matter one bit whether Vista arrived in 2006, 2007, or 2008? The answer is no. Because it DOES NOT MATTER. Microsoft creates new software because it doesn’t know what else to do. There is no real progress at Microsoft in its OS and office divisions.
The world would be better off with a feature freeze and security fixes.
demimondian
Given that the evidence supports the thesis that the alternative is considerably more expensive, I am absolutely willing to consider that an annuity I pay for computing.
And as to this:
If, in fact, the world will be better off with that, I’ll still wind up paying the annuity, because I’ll buy new computers, whether or not Microsoft releases a “new” operating system for them or a new version of “Office”. So it’s totally irrelevant to my original question, which is “why is Office so bad?”
It does a difficult job, and it does it extremely well. It’s certainly not perfect, but, like you said, if no other version were ever released, we’d not be any poorer. So, what’s so bad about it?
RSA
I think that with respect to standard office productivity applications such as word processors and (especially) presentation software, this is probably the case.
Andrew
Demi, I demand that you pay me $200 a year to be mediocre and not produce any value! I’ll see if I can convince Dell to add this as a special cost to their PCs.
I wish I could get $30 billion for never really improving.
demimondian
If Windows and Office were, in fact, mediocre and without value, then that’d be a great line. Since I’ve already explained why that isn’t so, though, I’m afraid it’s a bit of a flop.
As to “never really improving”…well, ok, but you’ve already said a bunch of times that Office is basically complete, that there are no new features it needs. That’s the foundation of a commodity business, then; it won’t get any better, except in that it might be produced or distributed more efficiently. If a product exists that meets a broad need, and it doesn’t need to get better, then why shouldn’t it provide income?
demimondian
Don’t lie to yourself: just as UI features are real features, security features are also real features. In a very real sense, you can’t have both a feature freeze and a batch of security fixes.
In fact, many security features have broad and subtle effects on the UI. Things stop working as users expect, and it’s often quite difficult to get them to understand that the particular thing that they used to do one way is still possible — just not that way.
Andrew
Price should drop to approach costs as development costs are recouped. And with software, marginal costs are near zero. (We know support is additional cost…)
Luckily, I get academic pricing (aka basically free) so this is all sophistry!
demimondian
Well, you know this, but on-going support of released versions consumes on the order of 25% of revenue for most established software lines. (That’s true across the board in the industry, not just in Woffice.) And prices drop only in the presence of real competition — and, despite having spent many billions of dollars on it, nobody has ever come close to replicating Office. (I know — I run OpenOffice.org’s abomination every day at work.)
Not me, dude. I pay full retail.
(I’m told that one of the benefits of exmsft is access to the company store. Price still isn’t as good as academic.)
RSA
I’m hardly lying to myself.
Andrew
Geez, RSA, you’re so critical. Much like security fixes are a feature, the fact that your car doesn’t explode arbitrarily is a feature, and that your TV doesn’t electrocute you when you touch it is a feature.
demimondian
Well, actually, “Security fixes are a feature” is kind of like “your car doesn’t explode when someone intentionally targets it with a bomb” or “your TV doesn’t electrocute you when you deliberately short yourself across the power supply while it’s plugged in”. The fact that thugs will do that, whether for fun or profit, and that people will happily click on the pretty “download today’s version of mySpyware (now with new improved pr0n!)” is a fact of life.
Andrew
To carry the analogy past the field of dead, beaten horses, Microsoft security features are like having a car insulated with TNT, and a TV that has capacitor leads sticking out of the on-off button.
demimondian
Hey, this is fun!
That is how some advanced tank armor actually works, though, so it’s not a great choice. :)
Bob Munck
I’ve been working in this area pretty much my whole career, from teaching programming at Brown in the 60’s through the Ada years to the DARPA STARS monster program in the 90’s and beyond. My strongly-held conclusion is that we don’t need to improve the programmer’s ability to write code; we need to improve the ability of an organization with 50 programmers to create working software. (Or, conversely, we need to change the world so that we never need software that has to have 50, or even 5, programmers to write it.) Intentional programming doesn’t appear to me to address that. It’s just another turn of the wheel that started with Larry Constantine’s measures of modularity, rolled through object-oriented design and domain-specific programming.
STARS ended up with domain-specific, and I’m not seeing anything here that wasn’t considered there, a decade ago. STARS, btw, spent half a billion DARPA dollars attempting to improve DoD software productivity by a factor of 10. It achieved an improvement by a factor of 1.0.
Andrew
Fred Brooks agrees with you, Bob.
And demi, I look forward to the day when I can get explosive reactive armor in my car bumpers. A 120mm gun would suffice, however.
Bob Munck
As a side note, it bugs me that these articles about Simonyi claim that he invented WYSIWYG word processing at PARC. The Brown HYPERTEXT Editing System was WYSIWYG (and in multiple windows) in 1967. Simonyi would know that; one of our students who did HYPERTEXT, Bob Wallace, was at Microsoft in the early days. Grumble, grumble.
demimondian
Now you’ve got me obsessing about how to install the platform so that the recoil wouldn’t flip the car the first time you fired it.
RSA
People so quickly forget. One of the assignments I’ve given my students is to do a point-by-point comparison of the WWW with memex (another piece of work, if only hypothetical, that more people should know about); they’re usually surprised at how much Vannevar Bush got right. A couple of years ago I saw a talk by Alan Kay in which he showed a video of Sutherland’s 1963 thesis work on Sketchpad (which I think is online somewhere); really good graphical user interfaces also have a much longer history than many people realize.
Bob Munck
And Andy van Dam’s thesis work in graphics was around that same time. Andy, my friend and mentor, was the main force behind HYPERTEXT. (Yes, Ted Nelson was there, but his pupils were always very small.) We were, in very large part, inspired by the MEMEX article. I’ve always wondered about the fact that Andy was on sabbatical at CERN around the time that Tim Berners-Lee was doing his original work.
My old boss at SofTech, Doug Ross, has some claim on having invented the whole idea of computer graphics at MIT. We needed hypertext just to capture all this stuff.
The BITBLT instruction on the Alto was important, though. It made overlapping windows possible with the hardware of the time.
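For readers who weren’t around then: a bit-block transfer just copies a rectangular region of one pixel buffer into another, which is what lets a windowing system repaint overlapping regions quickly. A toy sketch (Python, plain copy only; the real instruction did considerably more):

```python
# Toy sketch only: copy a rectangular block of pixels from one buffer into
# another at an offset. The real Alto instruction did much more, but the
# core idea is this nested copy.
def bitblt(src, dst, src_x, src_y, width, height, dst_x, dst_y):
    for row in range(height):
        for col in range(width):
            dst[dst_y + row][dst_x + col] = src[src_y + row][src_x + col]

screen = [[0] * 8 for _ in range(8)]      # a blank 8x8 framebuffer
window = [[1] * 4 for _ in range(3)]      # a 4x3 block of set pixels
bitblt(window, screen, 0, 0, 4, 3, 2, 2)  # blit the block onto the screen
print("\n".join("".join(str(bit) for bit in row) for row in screen))
```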
RSA
Wow, you were in the thick of things very early on. (I just googled your 1967 paper with van Dam.) Some pretty cool stuff has come out of Brown in computer science, though my knowledge of it is pretty much limited to their AI work.
Bob Munck
I’m the Zelig of the computer business. For instance, I was at Xerox El Segundo trying to make PARC stuff into a commercial product. I had Alto #35 and was about the 50th person on Ethernet. My (then future) wife and I exchanged what could be seen as email in 1970, her in Toronto and me in Providence.
Andy van Dam, though, is the Godfather of CS, the Hardest Working Man in computer biz. (Don’t worry about the James Brown analogy; at the end of any given year, Andy’s students look five years older and he looks a year younger.) His (prematurely old) former students are running CS departments all over: Guttag, Lazowska, Sedgewick, Tompa, Bergeron, etc. Also places like Pixar, Microsoft, Google.
RSA
Cool. It doesn’t appear that anyone in graphics maintains a genealogy list, as in software engineering and in AI, but that would be interesting to see. I think anyone I interact with at all regularly would be one or two generations away from van Dam; I do see that one of my friends from grad school published a paper with Bergeron a few years ago.