Wednesday, April 26, 2006

The Untrustworthy Web Services Revolution

I'm a little behind on my reading, and just got to Ken Birman's column in the February issue of IEEE Computer. It's a very clear assessment of some of the risks that society is running willy-nilly in its rush to take advantage of the very real benefits of the Web.
If spyware slows down my PC, that's inconvenient. It's a far more serious matter if vulnerabilities allow an intruder to wire-transfer my retirement savings to Nigeria, kill a patient in an intensive care unit, or launch a cruise missile from a Navy warship...

Breaking the cycle is going to require a response on many levels. The problems we're confronting have ethical, legal, and economic dimensions as well as technical ones:

* Why do kids view breaking into computer systems as a game?
* Why aren't we insisting that operators of sensitive computing systems have an obligation to maintain security, and forcing them to carry liability insurance to compensate anyone damaged by their failure to do so?
* Why is the technology economy so focused on software product quality on a per-product basis and indifferent to the inadequacies of systems built by integrating components using those products?



Monday, April 24, 2006

Business Week on the computer brain drain

Business Week has published an article on the latest defeat of US programming teams at the annual ACM International Collegiate Programming Contest.
America's dismal showing in a contest of college programmers highlights how China, India, and Eastern Europe are closing the tech talent gap

Ben Mickle, Matt Edwards, and Kshipra Bhawalkar looked as though they had just emerged from a minor auto wreck. The members of Duke University's computer programming team had solved only one problem in the world finals of the ACM International Collegiate Programming Contest in San Antonio on Apr. 12. The winning team, from Saratov State University in Russia, solved six puzzles over the course of the grueling five-hour contest. Afterward, Duke coach Owen Astrachan tried to cheer up his team by pointing out that they were among "the best of the best" student programmers in the world. Edwards, 20, still distraught, couldn't resist a self-deprecating dig: "We're the worst of the best of the best."

Duke wasn't the only U.S. school to be skunked at the prestigious computing contest. Of the home teams, only Massachusetts Institute of Technology ranked among the 12 highest finishers. Most top spots were seized by teams from Eastern Europe and Asia. Until the late 1990s, U.S. teams dominated these contests. But the tide has turned. Last year not one was in the top dozen.

The poor showings should serve as a wake-up call for government, industry, and educators. The output of American computer science programs is plummeting, even while that of Eastern European and Asian schools is rising. China and India, the new global tech powerhouses, are fueled by 900,000 engineering graduates of all types each year, more than triple the number of U.S. grads. Computer science is a key subset of engineering. "If our talent base weakens, our lead in technology, business, and economics will fade faster than any of us can imagine," warns Richard Florida, a professor at George Mason University and author of The Flight of the Creative Class.

Software programmers are the seed corn of the Information Economy, yet America isn't producing enough. The Labor Dept. forecasts that ``computer/math scientist'' jobs, which include programming, will increase by 40%, from 2.5 million in 2002 to 3.5 million in 2012. Colleges aren't keeping up with demand. A 2005 survey of freshmen showed that just 1.1% planned to major in computer science, down from 3.7% in 2000...

Yet computer science advocates say that unless the government enacts sweeping legislation aimed at improving the nation's technology competitiveness -- legislation now bogged down in Congress -- there's a limit to what can be done. "The attitude in the House is very toxic, and I don't see much chance of them coming together," says Deborah L. Wince-Smith, president of the Council on Competitiveness.

While Congress was fiddling, the kids from Saratov State were marching toward victory in San Antonio. The 83 teams sat at tables that were gradually festooned with color-coded balloons signaling which group had solved which problems. After an announcer ticked off the last 10 seconds in the contest, Saratov's players, coaches, and hangers-on shouted with joy and gave each other back-pounding bear hugs. "I feel euphoric," said team member Ivan Romanov. Victory was especially sweet, he added, because it came on the anniversary of cosmonaut Yuri Gagarin's 1961 voyage into space.

Gagarin's rocket ride shocked Americans out of their postwar complacency, sparking a national quest for tech superiority that led to such breakthroughs as the moon landing and the microchip. A trouncing in a programming contest doesn't inspire the same kind of response today. Truthfully, Americans just don't feel threatened enough to exert the effort. But if we wait too long, we might find ourselves playing catch-up again.
Thanks to Cameron Wilson for the pointer. See also the USACM blog.



Wednesday, April 12, 2006

Gas prices: Response to a chain letter

I received a chain letter from a good friend that made two not-so-good suggestions:
1) Mail a copy of this letter to 10 friends, and
2) Bring gas prices down by boycotting EXXON and MOBIL for the rest of the year.
Here's the idea: For the rest of this year, DON'T purchase ANY gasoline from the two biggest companies (which now are one), EXXON and MOBIL. If they are not selling any gas, they will be inclined to reduce their prices. If they reduce their prices, the other companies will have to follow suit. But to have an impact, we need to reach literally millions of Exxon and Mobil gas buyers.
My response:
I've generally stopped responding to chain letters of any sort, but this one actually needs a response.

Yes, gas at $3.00/gallon seems ridiculous. But in Europe you'd pay about double that, and if you adjust for inflation, $3.00 is actually less than we were paying for gas 30 years ago.

What is needed--for all kinds of reasons, including global warming and the fact that we are within about a generation of using all the oil that can profitably be extracted and sold at $10/gallon--is to STOP USING SO MUCH GAS.

We can do it the same way most of the rest of the world does: Raise the gasoline tax. (I'm actually old enough to remember buying gas when more than half of what I paid went to Federal and California taxes; they just haven't kept pace with inflation.) If the tax were, say, $3.00/gallon, or 100% of the sales price, or some such, people actually would stop using so much of it soon enough to do some good--by insisting on more fuel-efficient cars, by driving less, by carpooling, by taking public transit, etc. And it would shave a noticeable bit off the national deficit at the same time.
It wouldn't hurt to tax the excess profits of the oil companies, either.

There are people who don't care what kind of world or national deficit their grandkids inherit. I don't have any grandkids, but I care about yours.



Monday, April 10, 2006

We're still losing the women!

Yet another depressing post on the remarkable drop-off of women entering computing, in both absolute and relative terms. This one is in the CRA Bulletin.
Computer science has the dubious distinction of being the only science field to see a fall in the share of its bachelor’s degrees granted to women between 1983 and 2002. Among all S&E fields tracked by the NSF, linguistics was the only other field to see its share of women drop–but it is a field where the majority of degrees (71 percent) are granted to women.

Between 1983 and 2002, the share of CS bachelor’s degrees awarded to women dropped from 36 to 27 percent. The number of female degree recipients grew by 50 percent in that period, and in 2002 numbered 13,504. Nevertheless, this was lower than the 15,126 degrees granted to women in 1984, during the last boom in degree production.
The post comes with a graphic illustration.



Thursday, April 06, 2006

Book: Software Configuration Management Using Vesta

In a previous life I was a small part of a project called Vesta that developed a fairly radical approach to Software Configuration Management. Unfortunately, although Vesta has a lot of theoretical and practical advantages, it has proved difficult to get people to consider using it, because "not enough people are using it" (only a few hundred).

Vesta is available as open source for Linux under the LGPL.

I have just received a copy of a new book about Vesta by its principal developers.

I hope that this book will stimulate some fresh consideration of using Vesta on projects where reliability and repeatability of software construction are important. I attach some excerpts from the Introduction and Conclusions sections, to give the flavor and perhaps entice more people into looking at the book.

Vesta embodies the belief that reliable, incremental, consistent building is overwhelmingly important for software construction and that its absence from conventional development environments has significantly interfered with the production of large systems. Consequently, Vesta focuses on the two central challenges of large-scale software development -- versioning and building -- and offers a novel, integrated solution.

Versioning is an inevitable problem for large-scale software systems because software evolves and changes substantially over time. Major differences often exist between the source code in various shipped versions of a software product, as well as between the latest shipped version and the current sources under development, yet bugs have to be fixed in all these versions. Also, although many developers may work on the current sources at the same time, each needs the ability to test individual changes in isolation from changes made by others. Thus a powerful versioning system is essential so that developers can create, name, track, and control many versions of the sources.
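The immutability the excerpt describes can be illustrated with a toy sketch. This is not Vesta's repository code; the `VersionStore` class and its methods are hypothetical stand-ins that model just one property: a check-in creates a new version, existing versions are never modified, and each developer works on a private copy in isolation from the others.

```python
# Illustrative sketch only -- not Vesta's actual repository implementation.
# It models the property described above: versions are immutable, so each
# developer's working copy is isolated from everyone else's changes.

class VersionStore:
    """Append-only store: a check-in creates a new numbered version;
    existing versions are never modified afterward."""

    def __init__(self):
        self._versions = []          # list of immutable snapshots

    def checkin(self, files):
        """Record a snapshot of {filename: contents}; return its number."""
        snapshot = dict(files)       # copy, so later edits can't leak in
        self._versions.append(snapshot)
        return len(self._versions)   # versions are numbered from 1

    def checkout(self, version):
        """Return a private working copy of an existing version."""
        return dict(self._versions[version - 1])

store = VersionStore()
v1 = store.checkin({"main.c": "int main(){return 0;}"})

# A developer branches from v1 and changes it independently...
work = store.checkout(v1)
work["main.c"] = "int main(){return 1;}"
v2 = store.checkin(work)

# ...while v1 itself is untouched, so anything built from it is stable.
assert store.checkout(v1)["main.c"] == "int main(){return 0;}"
```

Real Vesta goes much further (named branches, access control, replication), but the append-only discipline above is the core that makes old versions trustworthy.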

Building is also a major problem. Without some form of automated support, the task of compiling or otherwise processing source files and combining them into a finished system is time-consuming, error-prone, and likely to produce inconsistent results. As a software system grows, this task becomes increasingly difficult to manage, and comprehensive automation becomes essential. Every organization with a multi-million line code base wants an automated build system that is reliable, efficient, easy-to-use, and general enough for their application. These organizations are very often dissatisfied with the build systems available to them and are forced to distort their development processes to cope with the limitations of their software-building machinery. …

Further, in contrast to most conventional SCM systems, Vesta takes the view that [versioning and building] interact, and that a proper solution integrates them so that the versioning and building facilities leverage each other's properties. …

Vesta subdivides the general problem of versioning into version management and source control. Building breaks down into system modeling and model evaluation. ...

Conventional build systems typically do not require and therefore rarely have comprehensive building instructions. Instead, they depend on the environment, which might comprise files on the developer's workstation and/or well-known server directories, to supply the unspecified pieces. This partial specification prevents repeatable builds. The first step toward achieving repeatability is to store source files and build tools immutably and immortally, as Vesta does, so that they are available when needed. The second step is to ensure that the building instructions are complete, recording precisely which versions of which source files went into a build, which versions of tools (such as the compiler) were used, which command-line switches were supplied to those tools, and all other relevant aspects of the building environment. Vesta's system models do precisely that. …
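What "complete building instructions" buys you can be sketched in a few lines. Vesta's system models are written in its own description language; this Python stand-in (the `build_fingerprint` function and its inputs are invented for illustration) just shows the idea: if every input that can influence a build -- source versions, tool versions, command-line switches -- is named explicitly, the whole build can be fingerprinted, and identical instructions are guaranteed to mean an identical build.

```python
# Hedged sketch, not Vesta's model language: everything that can affect
# the build's output is listed explicitly, then hashed into one fingerprint.
import hashlib
import json

def build_fingerprint(source_versions, tool_versions, switches):
    """Hash over every declared input to the build."""
    spec = {
        "sources": source_versions,   # exact version of every source file
        "tools": tool_versions,       # exact version of every tool used
        "switches": switches,         # command-line flags given to tools
    }
    blob = json.dumps(spec, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

fp1 = build_fingerprint(
    {"main.c": "v3", "util.c": "v7"},
    {"gcc": "3.4.2"},
    {"gcc": ["-O2", "-Wall"]},
)
# Same complete instructions -> same fingerprint: the build is repeatable.
fp2 = build_fingerprint(
    {"main.c": "v3", "util.c": "v7"},
    {"gcc": "3.4.2"},
    {"gcc": ["-O2", "-Wall"]},
)
assert fp1 == fp2
# Change any input (here, one compiler flag) and the fingerprint changes,
# which is exactly what a partial specification cannot detect.
fp3 = build_fingerprint(
    {"main.c": "v3", "util.c": "v7"},
    {"gcc": "3.4.2"},
    {"gcc": ["-O3", "-Wall"]},
)
assert fp3 != fp1
```

A conventional Makefile fails this test because the compiler version and much of the environment are implicit; Vesta's models pass it by construction.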

By following those instructions to the letter, the builder performs in effect a scratch build of the system. Completeness of the instructions makes the build repeatable, but for practicality it must also be incremental. Incrementality means skipping some build actions and using previously computed results instead, an optimization that risks inconsistency. To ensure that an incremental build is consistent, the Vesta builder records every dependency of every derived file on the environment in which it was built. …
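The consistency argument above can be made concrete with a small memoization sketch. Again this is not Vesta's evaluator (the `compile_step` function and its cache are hypothetical); it shows the rule the excerpt states: a previously computed result may be reused only when every recorded dependency is unchanged, so skipping work can never produce a result that differs from a scratch build.

```python
# Hedged sketch of incremental-yet-consistent building: reuse a cached
# result only when the fingerprint of ALL recorded dependencies matches.
import hashlib

cache = {}        # dependency fingerprint -> previously derived result
compilations = 0  # counts how often we actually "run the compiler"

def fingerprint(deps):
    """Stable hash over every (name, contents) dependency pair."""
    h = hashlib.sha256()
    for name in sorted(deps):
        h.update(name.encode())
        h.update(deps[name].encode())
    return h.hexdigest()

def compile_step(deps):
    """'Compile' deps, skipping the work when a cached result is valid."""
    global compilations
    key = fingerprint(deps)
    if key in cache:
        return cache[key]            # incremental: reuse the prior result
    compilations += 1
    result = "object-code-for:" + key[:8]   # stand-in for real output
    cache[key] = result
    return result

deps = {"main.c": "int main(){return 0;}", "cc-flags": "-O2"}
compile_step(deps)
compile_step(deps)                    # cache hit: nothing recompiled
assert compilations == 1
deps["main.c"] = "int main(){return 1;}"
compile_step(deps)                    # a dependency changed: rebuild
assert compilations == 2
```

Because the cache key covers every dependency (including the flags), a hit is provably equivalent to redoing the work, which is the consistency guarantee timestamp-based tools like Make cannot offer.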
The Vesta system's objective: to be a software configuration management system that scales to accommodate large software, is easy to use, and produces repeatable, incremental, and consistent builds. …

To summarize briefly, Vesta
● preserves source code immutably and immortally,
● supports both simple linear versioning and arbitrarily complex branching for parallel development,
● makes all versions directly accessible through the file system,
● provides very fast check-out and check-in using copy-on-write,
● supports distributed development with source replication, cross-repository check-out and check-in, and cross-realm access control,
● manages storage for source and derived files largely automatically,
● provides a flexible, general description language for the precise description and modular organization of software system construction,
● enables integration of new build tools within the description language without modification of either the tools or the Vesta system,
● builds software configurations repeatably, incrementally, and consistently, and
● runs as fast as Make for scratch builds and outperforms it for incremental ones. …

Vesta has been in daily use by a major engineering group since 1999. This group has evolved over the years, but it started out as DEC/Compaq's Araña group, which at the time it began using Vesta was a team of about 130 developers working on a large microprocessor design. The team was organized as two subgroups, one in New England and the other in California, each with its own Vesta repository. Both the chip design itself and the team's custom design software were stored in the repositories and developed using Vesta's suite of tools. The final code base consisted of about 700,000 lines. …

Overall, the Araña group found Vesta a substantial improvement over their previous build tools. Vesta's strong support for parallel source development and repeatable builds saved them considerable time (3 to 6 months in the architectural design phase alone), and the distributed development features provided answers to some extremely difficult problems they faced in bicoastal software and design database management. They also found Vesta's repeatability and consistency guarantees to be extremely useful for tracking down difficult bugs, a characteristic that gained in importance as they approached completion of the chip design.

After the DEC/Compaq Alpha division was transferred to Intel, the former Araña group continued to use Vesta on new projects. As of February 2005, the number of developers using Vesta had grown to over 350, and the system being built had exceeded three million source lines. …

Collectively, the group at Intel routinely runs 20 simultaneous builds, each performing up to 10 tool invocations in parallel. Thus, 200 separate tools (mostly on distinct machines) are simultaneously competing for the repository's services. ...