Tuesday, October 25, 2005

Electronic Voting Not Yet Secure

The Government Accountability Office has released a 107-page report on e-voting, prepared at the request of Congress: "Federal Efforts to Improve Security and Reliability of Electronic Voting Systems Are Under Way, but Key Activities Need to Be Completed."
While electronic voting systems hold promise for improving the election process, numerous entities have raised concerns about their security and reliability, citing instances of weak security controls, system design flaws, inadequate system version control, inadequate security testing, incorrect system configuration, poor security management, and vague or incomplete voting system standards...

To help ensure the security and reliability of electronic voting systems, GAO is recommending that EAC define specific tasks, processes, and time frames for improving the national voting systems standards, testing capabilities, and management support available to state and local election officials.


Tuesday, October 18, 2005

You're not paranoid:
Your printer is tracking you!

It seems that at least some color printers sold in the United States print nearly invisible yellow dots on every page. The Secret Service can read these dots to determine the date and time a page was printed, along with what appears to be a serial number for the printer.

Of course, this makes those printers a spectacularly bad choice for counterfeiting money or other financial instruments, but one can imagine other uses for the marks, such as tracking down anonymous whistle-blowers...

Reported by the Electronic Frontier Foundation, and further commented on by Ed Felten.
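To make the mechanism concrete, here is a minimal sketch of what decoding such a dot grid could look like. It is loosely modeled on the 15-column, 8-row pattern the EFF documented on Xerox DocuColor printers, but the field positions and parity rule below are simplified assumptions for illustration, not the EFF's actual layout.

```python
# Hypothetical decoder for a printer tracking-dot grid. The EFF documented a
# 15-column-by-8-row dot pattern on Xerox DocuColor printers; the field
# positions and parity rule below are simplified ASSUMPTIONS for illustration,
# not the EFF's actual layout.

# A grid is 8 strings of 15 characters each; '1' marks a yellow dot.
ASSUMED_FIELDS = {        # column index -> meaning (illustrative only)
    1: "minute", 4: "hour", 5: "day", 6: "month", 7: "year",
    11: "serial_hi", 12: "serial_mid", 13: "serial_lo",
}

def decode_column(rows, col):
    """Read a column top-to-bottom: row 0 is parity, rows 1-7 are data bits."""
    bits = [int(rows[r][col]) for r in range(8)]
    if sum(bits) % 2 != 1:               # assume odd parity per column
        raise ValueError(f"parity error in column {col}")
    return int("".join(map(str, bits[1:])), 2)

def decode_grid(rows):
    fields = {name: decode_column(rows, col)
              for col, name in ASSUMED_FIELDS.items()}
    # Assemble a six-digit serial from three two-digit column values.
    fields["serial"] = "{:02d}{:02d}{:02d}".format(
        fields.pop("serial_hi"), fields.pop("serial_mid"),
        fields.pop("serial_lo"))
    return fields
```

The details of the encoding matter less than how little capacity it takes: a timestamp and a serial number fit comfortably in a hundred-odd dots, repeated across every page.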


Sunday, October 16, 2005

New, Improved SEMANTIC Web

Thanks to Brian Randell for a pointer to this image.


Let those dopers be

An Op-ed by Norm Stamper in the Los Angeles Times urges an end to the losing war by legalizing pot, coke, meth and other drugs.
Sometimes people in law enforcement will hear it whispered that I'm a former cop who favors decriminalization of marijuana laws, and they'll approach me the way they might a traitor or snitch. So let me set the record straight.

Yes, I was a cop for 34 years, the last six of which I spent as chief of Seattle's police department.

But no, I don't favor decriminalization. I favor legalization, and not just of pot but of all drugs, including heroin, cocaine, meth, psychotropics, mushrooms and LSD.

Decriminalization, as my colleagues in the drug reform movement hasten to inform me, takes the crime out of using drugs but continues to classify possession and use as a public offense, punishable by fines.

I've never understood why adults shouldn't enjoy the same right to use verboten drugs as they have to suck on a Marlboro or knock back a scotch and water.

Prohibition of alcohol fell flat on its face. The prohibition of other drugs rests on an equally wobbly foundation. Not until we choose to frame responsible drug use — not an oxymoron in my dictionary — as a civil liberty will we be able to recognize the abuse of drugs, including alcohol, for what it is: a medical, not a criminal, matter.

As a cop, I bore witness to the multiple lunacies of the "war on drugs." Lasting far longer than any other of our national conflicts, the drug war has been prosecuted with equal vigor by Republican and Democratic administrations, with one president after another — Nixon, Ford, Carter, Reagan, Bush, Clinton, Bush — delivering sanctimonious sermons, squandering vast sums of taxpayer money and cheerleading law enforcers from the safety of the sidelines.

It's not a stretch to conclude that our draconian approach to drug use is the most injurious domestic policy since slavery.


Thursday, October 13, 2005

National Academies Panel Sounds Alarm On Science Education

The Committee on Prospering in the Global Economy of the 21st Century was created by the National Academy of Sciences and the National Academy of Engineering at the behest of members of Congress from both parties. It included 20 of the nation's most prominent business leaders, educators and scientists, including three Nobel Prize winners. It has just issued a report, Rising Above The Gathering Storm: Energizing and Employing America for a Brighter Economic Future.

From the press release:
The unmatched vitality of the United States' economy and science and technology enterprise has made this country a world leader for decades, allowing Americans to benefit from a high standard of living and national security. But in a world where advanced knowledge is widespread and low-cost labor is readily available, U.S. advantages in the marketplace and in science and technology have begun to erode. A comprehensive and coordinated federal effort is urgently needed to bolster U.S. competitiveness and pre-eminence in these areas so that the nation will consistently gain from the opportunities offered by rapid globalization...

· For the cost of one chemist or one engineer in the United States, a company can hire about five chemists in China or 11 engineers in India.

· Last year chemical companies shuttered 70 facilities in the United States and tagged 40 more for closure. Of 120 chemical plants being built around the world with price tags of $1 billion or more, one is in the United States and 50 are in China.

· U.S. 12th-graders recently performed below the international average for 21 countries on a test of general knowledge in mathematics and science. In addition, an advanced mathematics assessment was administered to students in 15 other countries who were taking or had taken advanced math courses, and to U.S. students who were taking or had taken pre-calculus, calculus, or Advanced Placement calculus. Eleven countries outperformed the United States, and four scored similarly. None scored significantly below the United States.

· In 1999 only 41 percent of U.S. eighth-graders had a math teacher who had majored in mathematics at the undergraduate or graduate level or studied the subject for teacher certification -- a figure that was considerably lower than the international average of 71 percent.

· Last year more than 600,000 engineers graduated from institutions of higher education in China. In India, the figure was 350,000. In America, it was about 70,000.

· In 2001 U.S. industry spent more on tort litigation than on research and development.

Without a major push to strengthen the foundations of America's competitiveness, the United States could soon lose its privileged position. The ultimate goal is to create new, high-quality jobs for all citizens by developing new industries that stem from the ideas of exceptional scientists and engineers.
But who's listening?


Wednesday, October 12, 2005

Complexity

Bob Colwell, the 2005 recipient of the IEEE Computer Society/ACM Eckert-Mauchly Award, has a thoughtful "At Random" column in the October 2005 issue of IEEE Computer. [IEEE membership required for free access.]

Substitute "software" for "design" and "programming" for "engineering" (which I believe is a perfectly appropriate substitution), and you have an equally thoughtful essay on the effects of complexity in software and how to deal with them.
When nature is the adversary, all that stands between the engineered product and disaster is the product designers’ foresight, wisdom, and skill...

The real art of engineering, its sine qua non, is in evaluating a proposed design from every angle and vantage point to make sure a design will achieve its goals and prove reliable over its intended lifespan.

When a simulation says a design is working, ask whether the simulation is correct and complete. If a formal proof asserts that some aspect of the design is correct, ask whether the proof itself is trustworthy. What makes you so sure the implementation technology will work as needed? What if the product specification itself has holes or blind spots? What if your product’s buyers like it so much that they begin using it in ways you hadn’t intended—can you anticipate that so the product will gracefully accommodate its new uses?

If the designer knows what she’s doing, the design incorporates existing lore—only the desperate or suicidally naive would attempt a product with no familiar or known-trustworthy components—but it can’t be based only on known components.

The nature of engineering is to never design exactly the same thing twice. Every new design pushes the envelope somewhere: performance, cost, reliability, features, capacity. Inevitably, some aspects of the new design will be outside the existing experience base. That’s the part of engineering you don’t learn in school. And assuming two competing design teams are technically proficient and reasonably well led, it’s what these teams decide to do in these unknown areas that will largely determine which design ultimately triumphs.

So how do you handle the wilderness areas of your design, those places beyond your comfort zone and the safety of your tools and direct experience? I think the answer comes down to how well you handle complexity...

You can’t quantify complexity, but you can feel it. In fact, during a leading-edge design, if anything, complexity feels more real than many of the design’s more quantifiable aspects. Your simulation tools can tell you a microprocessor’s projected die size, but there’s still time before the tapeout. Things happen, and as long as you believe that aspect of the design is on track, then for today it’s mainly a theoretical concern.

Performance and power dissipation feel that same way. But complexity is a monster you can hear breathing right outside your cubicle. It whispers your name during planning meetings, but if you’re not paying attention you may not hear it. Later on, you find yourself at the moment of truth in a project, suddenly realizing that things aren’t right and it’s by no means clear if there is a way to salvage them, let alone what that way might be.

There are some hallmarks to complexity that I’ve noticed over the years. I believe design complexity is a function of the
• number of ideas you must hold in your head simultaneously;
• duration of each of those ideas; and
• cross product of those two things, times the severity of interactions between them...

Complexity makes it a little clearer why project success is so sensitive to unknowns. The very fact that these unknowns are, well, unknown, means that they could inject a wide range of behavior into the design. That uncertainty range increases the number of ideas you must simultaneously consider, and it might also increase the predicted time durations needed.

It doesn’t take much uncertainty to make the complexity-derived behavior range too big to mentally handle. And this is perhaps the most insidious issue of all: When faced with a complex situation, you generally know that you aren’t yet in command of all the necessary details. This in itself isn’t alarming; it takes time to understand what hundreds of people are doing or intend to do. But you can’t leave it like that.

One option is to insist that all of the unknowns be researched and quantified so that your spreadsheet can tell you what to do. But this scheme doesn’t work. Your project doesn’t have enough time or idle engineers to do this much new work; besides, not everything you want to know about your project is knowable, much less quantifiable...

Complex designs are more fragile and lead to more surprises (which are always bad). Complexity leads to longer development schedules; it directly causes design errata; it fosters suboptimal tradeoffs between competing goals; it makes follow-on designs much more difficult; and it’s cumulative, with new designs inheriting all of the complexity of the old and with new complications layered on top.

Increased project complexity also shrinks the available pool of engineers who can help when things go awry. Any competent designer can make easy, straightforward choices, but for truly gnarly situations, only wizards will do. And there are never enough wizards.

Stanford’s John Hennessy once said that it’s always possible to design something so complicated that you can never get it right. As a project leader, are you smart enough to mentally absorb the remaining project unknowns on top of what you already know to the extent necessary to make good decisions on any remaining tradeoffs (some of which haven’t even surfaced yet)?

To some extent, everyone on the design team has this same problem. In considering a design’s overall complexity, you’re making the “smart enough” choice on their behalf as well. If, after deep reflection, you believe you have a practical grasp of the project’s complexities, with enough margin to handle the usual surprises downstream, great. Today will be a good day and you won’t have to think about this again until next week, when you must confront the same issue again. But what if you decide that things seem to have gotten a bit too close to the edge? What do you do then? ...

A design’s complexity must serve a project’s major goals. If your design is complicated but coherent, challenging but understandable, you may have struck a good balance between irreducible complexity and the project’s goals.

Strive to avoid creeping complexity, the kind that arises from unintended interactions among multiple unrelated design decisions. Eschew complicated machinery that isn’t clearly and provably necessary to attain one or more of the major project goals. If you aren’t sure you need some logic in your design, keep asking questions until someone either justifies it or agrees to toss it overboard.

Entropy always drags a project in the direction of increasing complexity; things never get simpler on their own.
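Colwell is emphatic that you can't quantify complexity, but his three factors are concrete enough to caricature in code. The toy scoring function below is my own invention, purely for illustration — the idea names, durations, severity weights, and the weighting scheme itself are all made up — and it only shows how "number of ideas, duration, and severity of interactions" compound.

```python
# A toy rendering of Colwell's complexity factors. He offers the factors, not
# a formula; the weighting, idea names, durations, and severities below are
# all invented for illustration.

from itertools import combinations

def complexity_score(ideas, severity):
    """ideas: name -> weeks the idea must stay 'in your head'
    severity: frozenset({a, b}) -> interaction severity in [0, 1]"""
    n = len(ideas)                        # ideas held simultaneously
    total_duration = sum(ideas.values())  # how long each stays open
    interactions = sum(                   # cross product, times severity
        severity.get(frozenset(pair), 0.0) * ideas[pair[0]] * ideas[pair[1]]
        for pair in combinations(ideas, 2))
    return n * total_duration + interactions

ideas = {"cache-coherence": 6, "power-budget": 4, "new-ISA-extension": 8}
severity = {frozenset({"cache-coherence", "power-budget"}): 0.3,
            frozenset({"cache-coherence", "new-ISA-extension"}): 0.9}
print(complexity_score(ideas, severity))
```

Note how the single severe interaction between the two long-lived ideas contributes far more to the score than the mild one — which matches Colwell's observation that it's the interactions among open questions, not their count alone, that make a design feel complex.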


Monday, October 10, 2005

Less is not always more.

An AP report details how hurricane forecasters have been plagued with missing or broken equipment for more than a decade. Yet another symptom of the "wait until it breaks" syndrome.
Forecasters at the National Hurricane Center have struggled for more than a decade to issue accurate storm reports using broken equipment, an overbooked airplane fleet and tight budgets, a newspaper reported Sunday.

Key forecasting equipment used by the center has broken down or been unavailable for nearly half of the 45 hurricanes that have struck land since 1992, The Miami Herald found after an eight-month investigation.

"It's almost like we're forecasting blind," said Pablo Santos, a science officer at the National Weather Service's Miami office, which assists the hurricane center during storms. "We've never really had the equipment to do it."
Full story here.

There are some things we rely on the government to do for us, because no one else can or will. And running them on too tight a budget is often a false economy.


U.S. cybersecurity due for FEMA-like calamity?

An article by Declan McCullagh and Anne Broache on CNET News.com summarizes what many thoughtful observers have been saying for some time: Our critical infrastructure is just as vulnerable to cyber-attack as skyscrapers were to hijackers before 9/11. Many privately wonder why we have not already been attacked.
In the wake of Hurricane Katrina, the Federal Emergency Management Agency has been fending off charges of responding sluggishly to a disaster.

Is the cybersecurity division next?

Like FEMA, the U.S. government's cybersecurity functions were centralized under the Department of Homeland Security during the vast reshuffling that cobbled together 22 federal agencies three years ago.

Auditors had warned months before Hurricane Katrina that FEMA's internal procedures for handling people and equipment dispatched to disasters were lacking. In an unsettling parallel, government auditors have been saying that Homeland Security has failed to live up to its cybersecurity responsibilities and may be "unprepared" for emergencies.

"When you look at the events of Katrina, you kind of have to ask yourself the question, 'Are we ready?'" said Paul Kurtz, president of the Cyber Security Industry Alliance, a public policy and advocacy group. "Are we ready for a large-scale cyberdisruption or attack? I believe the answer is clearly no." ...

More so than FEMA, the department's cybersecurity functions have been plagued by a series of damning reports, accusations of bureaucratic bungling, and a rapid exodus of senior staff that's worrying experts and industry groups. The department is charged with developing a "comprehensive" plan for securing key Internet functions and "providing crisis management in response to attacks"--but it's been more visible through press releases such as one proclaiming October to be "National Cyber Security Awareness Month."

Probably the plainest indication of potential trouble has been the rapid turnover among cybersecurity officials...

"In the previous incarnation, DHS and the Homeland Security Council didn't really know what to do with cyber--it's been a deer-in-the-headlights experience for them," Lewis said. "It's not clear who's even in charge. When you look at all the different committees who assert they have a role in cybersecurity, it's about a dozen. Whenever you have 12 committees in charge, that means no one's in charge." ...

Even before Sept. 11, however, the federal government's cybersecurity efforts were being described as slipshod. In a blistering 108-page report released in early 2001, government auditors said the FBI's National Infrastructure Protection Center had become a bureaucratic backwater that was surprisingly ineffective in pursuing malicious hackers or devising a plan to shield the Internet from attacks...

A May 2005 report by the Government Accountability Office warned that bot networks, criminal gangs, foreign intelligence services, spammers, spyware authors and terrorists were all "emerging" threats that "have been identified by the U.S. intelligence community and others." Even though Homeland Security has 13 responsibilities in this area, it "has not fully addressed any," the GAO said.

Other analyses have said the agency is plagued by incompatible computer systems; yet another found that Homeland Security was woefully behind in sharing computer security information with private companies...

But the right tools and funding have to be in place, too, said Ed Lazowska, a computer science professor at the University of Washington. He co-chaired the president's Information Technology Advisory Committee, which published a report in February that was critical of federal cybersecurity efforts.

"DHS has an appropriately large focus on weapons of mass destruction but an inappropriately small focus on critical infrastructure protection, and particularly on cybersecurity," Lazowska said in an e-mail interview.

The department is currently spending roughly $17 million of its $1.3 billion science-and-technology budget on cybersecurity, he said. His committee report calls for a $90 million increase in National Science Foundation funding for cybersecurity research and development.

Until then, Lazowska said, "the nation is applying Band-Aids, rather than developing the inherently more secure information technology that our nation requires."


Friday, October 07, 2005

Science Abuse

Scientific American has an interesting review by Boyce Rensberger of The Republican War on Science by Chris Mooney.
Thomas Jefferson would be appalled. More than two centuries after he helped to shape a government based on the idea that reason and technological advancement would propel the new United States into a glorious future, the political party that now controls that government has largely turned its back on science.

Even as the country and the planet face both scientifically complex threats and remarkable technological opportunities, many Republican officeholders reject the most reliable sources of information and analysis available to guide the nation. As inconceivable as it would have been to Jefferson--and as dismaying as it is to growing legions of today's scientists--large swaths of the government in Washington are now in the hands of people who don't know what science is. More ominously, some of those in power may grasp how research works but nonetheless are willing to subvert science's knowledge and expert opinion for short-term political and economic gains.

That is the thesis of The Republican War on Science, by Chris Mooney, one of the few journalists in the country who specialize in the now dangerous intersection of science and politics. His book is a well-researched, closely argued and amply referenced indictment of the right wing's assault on science and scientists. Mooney's chronicle of what he calls "science abuse" begins in the 1970s with Richard Nixon and picks up steam with Ronald Reagan. But both pale in comparison to the current Bush administration...

This naive understanding of science hands the Right a time-tested tactic. It does not claim that business interests or moral values trump the scientific consensus. Rather, rightists argue that the consensus itself is flawed. Then they encourage a debate between the consensus and the extremist naysayers, giving the two apparently equal weight. Thus, Mooney argues, it seems reasonable to split the difference or simply to argue that there is too much uncertainty to, say, ban a suspect chemical or fund a controversial form of research.


The sky really is falling

Computerworld has a good interview with Ed Lazowska, co-chairman of the last President's Information Technology Advisory Committee (PITAC).
Under Lazowska's leadership, PITAC studied three issues: IT for health care, the future of computational science and cybersecurity. PITAC's report on cybersecurity, called "Cyber Security: A Crisis of Prioritization," was published in February.

"The title nicely summarizes our findings," Lazowska says. "There is a crisis, and it is due to a failure to adequately prioritize this issue--a failure by CIOs, and a failure by the federal government."

Lazowska doesn't pull any punches when discussing the Bush administration's approach to the issue.

"In my opinion," he says, "this administration does not value science, engineering, advanced education and research as much as it should -- as much as the future health of the nation requires." ...

"There is a big gap between what we already know about cybersecurity and our deployment of technologies and processes to improve it. That's a CIO problem. There's also a big gap between what we already know about cybersecurity and what we need to know in order to engineer adequately secure systems for the long-term future. That's a federal government problem, because the federal government is responsible for R&D that looks out more than one product cycle--R&D such as engineering a more secure version of the Internet." ...

"We see some of the effects of cybervulnerabilities on a daily basis on the front page of our newspapers: phishing attacks, pharming attacks, denial-of-service attacks and large-scale disclosure of credit card information. Even phishing attacks, which seem easy to dismiss as a gullibility problem, arise from the basic design of the protocols we use today, which make it impossible to determine the source of a network communication with certainty.

"The public, and most CIOs, do not see many activities that are even more threatening. The nation's IT infrastructure is now central to the life of all other elements of the nation's critical infrastructure: the electric power grid, the air traffic control network, the financial system and so on. If you wanted to go after the electric power grid -- even the physical elements of the electric power grid -- then a cyberattack would surely be the most effective method. It's also worth noting that the vast majority of the military's hardware and software comes from commercial vendors. PITAC was told that 85% of the computing equipment used in Iraq was straight commercial. So the military itself is arguably about as vulnerable to a cyberattack as the civilian sector."
