Ralph Nader sure knows how to marginalize his candidacy for president: He picked a brooding, far-left running mate who couldn't get elected mayor of San Francisco.

It's Ralph Nader who really needs to wrap up and close down.
Perhaps Nader sees a younger version of himself in 42-year-old Matt Gonzalez. Each wields a keen intellect, a certitude about his standing in the political struggle between good and evil, and a certain obliviousness to how his ego can get in the way of the cause.
Nader expresses not the least bit of remorse that his 2000 run on the Green Party ticket helped tilt Florida, and thus the presidency, to George W. Bush, in an excruciatingly close race. And he is undeterred by the prospect that he could be a spoiler again. If Democrats can't win by a landslide in 2008, Nader has suggested, they might as well "wrap up, close down, emerge in a different form."
"Wrap up, close down" is a pretty fair description of what happened to the progressive movement that Gonzalez energized in his 2003 race against Gavin Newsom...
Thursday, February 28, 2008
Congressional leaders on Thursday questioned the Department of Homeland Security's past and present efforts to secure the government's networks and dismissed its new plan to improve security as inadequate and behind the times.

2/29/08 update: See also this article by Andy Greenberg in Forbes.
"It's hard to believe that this administration believes it has the answers to securing our networks and critical infrastructure," said Rep. Bennie Thompson (D-Miss.) during an often contentious hearing on President Bush's so-called Cyber Initiative before the House Committee on Homeland Security Thursday morning. "I have enormous questions about this initiative. Thus far, I have been extremely disappointed in this administration's efforts in cybersecurity."
Several committee members, including Thompson, Rep. Jane Harman (D-Calif.) and Rep. Bob Etheridge (D-N.C.), were surprised by how little information DHS and other agencies involved in cybersecurity share with each other about current threats, past attacks and other critical issues.
"I have been sitting here with my mouth open. This hearing reminds me of the FEMA trailers. The fact that you don't have threat information is shocking," Harman said. "We are not being serious about our response to threats. How is it that we're going to have, in real time, a response to a significant threat? I just don't see it."
The United States incarcerates more people than any other country in the world and for the first time in the nation's history, more than one in every 100 American adults is confined in a prison or jail, according to a report released on Thursday.

Not a world record to be proud of.
The report by the Pew Center on the States said the American penal system held more than 2.3 million adults at the start of the year.
The far more populous nation of China ranked second with 1.5 million behind bars, with Russia a distant third with 890,000 inmates.
"Beyond the sheer number of inmates, America also is the global leader in the rate at which it incarcerates its citizenry, outpacing nations like South Africa and Iran," according to the report.
The Iraq war has cost the US 50-60 times more than the Bush administration predicted and was a central cause of the sub-prime banking crisis threatening the world economy, according to Nobel Prize-winning economist Joseph Stiglitz.

Think of all the useful and productive things that could have been done in America with an additional $10,000 per person. Health care, education, infrastructure, research, elimination of hunger, energy independence, ...
The former World Bank vice-president yesterday said the war had, so far, cost the US something like $US3 trillion ($3.3 trillion), compared with the $US50 billion to $US60 billion predicted in 2003.
Details and documentation in the book. Reviews.
Introduction and overview.
Call for papers.
Extended abstracts due March 30.
I'm on the (large) Program Committee.
Updated March 21, 2008 to reflect extended submission deadline.
IEEE's The Risk Factor has a good overview and summary.
If you are unlucky enough to get on this list, it's probably easier to stop traveling than to get off the list. And safer.
Wednesday, February 27, 2008
A former White House technology manager told the committee in statements released yesterday that the Bush administration's e-mail system "was primitive and the risk that data would be lost was high."
Steven McDevitt, who left the White House in 2006, said he supervised an internal study that found hundreds of days in which no electronic messages were stored for one or more White House offices from January 2003 to August 2005. The study stated a range when tallying the total number of days in which an office had no recorded e-mails, from 473 -- which had been previously reported -- to more than 1,000, McDevitt said.
McDevitt also said security was so lax that e-mail could be modified by anyone on the computer network until the middle of 2005.
Tuesday, February 26, 2008
Monday, February 25, 2008
Best Buy has revealed that an unknown number of digital photo frames it sold recently contained a sophisticated computer virus. There have been many stories about it. The San Francisco Chronicle article by Deborah Gage is pretty good.
"It is a nasty worm that has a great deal of intelligence," said Brian Grayek, who heads product development at Computer Associates, a security vendor that analyzed the Trojan Horse.

See also this earlier article by Deborah.
The virus, which Computer Associates calls Mocmex, recognizes and blocks antivirus protection from more than 100 security vendors, as well as the security and firewall built into Microsoft Windows. It downloads files from remote locations and hides files, which it names randomly, on any PC it infects, making itself very difficult to remove. It spreads by hiding itself on photo frames and any other portable storage device that happens to be plugged into an infected PC.
The authors of the new Trojan Horse are well-funded professionals whose malware has "specific designs to capture something and not leave traces," Grayek said. "This would be a nuclear bomb" of malware.
By studying how the code is constructed and how it's propagated, Computer Associates has traced the Trojan to a specific group in China, Grayek said. He would not name the group.
The strength of the malware shows how skilled hackers have become and how serious they are about targeting digital devices, which provide a new frontier for stealing information from vast numbers of unwary PC owners.
Back in the days of floppy disks, cautious PC users always scanned any incoming floppy for viruses. But now we've gotten lazy, and plug in thumb drives, cameras, and picture frames without any thought that they may be carriers of malware. And some of us will pay dearly for this thoughtlessness.
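In that spirit, here is a toy sketch (my own illustration, not anything from the articles) of the kind of minimal audit a cautious user might run over a freshly plugged-in drive before opening anything on it: flag autorun files and executables, the classic carriers for USB-borne malware. The file-name and extension lists are arbitrary examples, not an authoritative blocklist.

```python
import os

SUSPICIOUS = {'autorun.inf'}                       # auto-execution trigger on old Windows
EXEC_EXTS = {'.exe', '.scr', '.com', '.pif', '.vbs'}  # common executable extensions

def audit_removable_media(mount_point: str):
    """Walk a just-mounted drive and return paths of files that deserve
    a closer look before anything on the drive is opened."""
    findings = []
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            lower = name.lower()
            if lower in SUSPICIOUS or os.path.splitext(lower)[1] in EXEC_EXTS:
                findings.append(os.path.join(root, name))
    return findings
```

Of course, a check like this catches only the crudest carriers; a real scanner inspects file contents, not names. But it beats plugging in a picture frame with no thought at all.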
Thursday, February 21, 2008
Today eight colleagues and I are releasing a significant new research result. We show that disk encryption, the standard approach to protecting sensitive data on laptops, can be defeated by relatively simple methods. We demonstrate our methods by using them to defeat three popular disk encryption products: BitLocker, which comes with Windows Vista; FileVault, which comes with MacOS X; and dm-crypt, which is used with Linux...

As Mark Twain said, "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
The root of the problem lies in an unexpected property of today’s DRAM memories. DRAMs are the main memory chips used to store data while the system is running. Virtually everybody, including experts, will tell you that DRAM contents are lost when you turn off the power. But this isn’t so. Our research shows that data in DRAM actually fades out gradually over a period of seconds to minutes, enabling an attacker to read the full contents of memory by cutting power and then rebooting into a malicious operating system.
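One reason this attack is practical is that key material is easy to spot in a captured memory image: cryptographic keys are nearly random, while most memory contents are not. Here is a toy sketch of such an entropy-based search (my own illustration, not the researchers' actual tooling; the window size and threshold are arbitrary choices for demonstration).

```python
import math

def shannon_entropy(window: bytes) -> float:
    """Shannon entropy in bits per byte: 0.0 for a constant run, near 8.0 for random data."""
    counts = [0] * 256
    for b in window:
        counts[b] += 1
    n = len(window)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

def find_key_candidates(image: bytes, window: int = 256, step: int = 64,
                        threshold: float = 7.0):
    """Slide a window across a memory image and return offsets whose
    entropy is high enough to suggest key material."""
    hits = []
    for off in range(0, len(image) - window + 1, step):
        if shannon_entropy(image[off:off + window]) >= threshold:
            hits.append(off)
    return hits
```

The researchers' real attack is cleverer than this (it looks for the internal structure of AES key schedules, which lets it correct bit errors from the decay), but the basic point stands: if the bits survive the power cut, finding the key among them is the easy part.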
Tuesday, February 19, 2008
IEEE Spectrum Online's "The Risk Factor" has a terse summary of a longer article by the Los Angeles Times.
Six years ago, Los Angeles County began using a ballot for nonpartisan voters that had a little-noticed design flaw. Confusion over how to mark the ballot, critics say, caused tens of thousands of votes to go uncounted in three elections between 2002 and 2006.
At the time, election officials knew that some votes were not being counted but saw no need to make changes. After all, the missing votes went unnoticed in the three primary elections and no one complained.
Friday, February 15, 2008
Good summary of its implications by Kelly Jackson Higgins in Dark Reading.
The industry is just one multi-million-dollar corporate data breach away from waking up to the serious and often-silent threat of corrupted DNS resolution servers, says DNS inventor Paul Mockapetris.

Surely the Department of Homeland Security is working feverishly to block this threat to the very core of the Internet? Don't bet on it.
Mockapetris--who is also chief scientist and chairman of the board for network naming and address vendor Nominum--says the recent research on corrupted DNS resolution servers by researchers at Georgia Tech and Google demonstrates yet another way the bad guys are attacking DNS to infect users. (See Hacking a New DNS Attack.)
Researchers David Dagon, Chris Lee, and Wenke Lee of Georgia Tech, and Google's Niels Provos, dubbed the new threat "DNS resolution path corruption," where malicious DNS servers provide false information in order to send users to malicious sites. The researchers officially presented their findings today at the Network and Distributed System Security Symposium (NDSS) in San Diego.
In their study of DNS resolution, they found around 17 million open-recursive DNS servers on the Net, and discovered that about 0.4 percent, or 68,000 of them, were performing malicious operations, answering DNS queries with false information that sends users to malicious sites. About 2 percent were returning suspicious results, they reported.
“This report demonstrates that people are getting lured out to dark alleyways of the Internet. The actual damage isn’t documented here, but it will be” somewhere when someone loses the first $10 million to $100 million to this type of attack, Mockapetris says.
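How would you tell whether a resolver is lying to you? One approach (my own sketch, not the researchers' methodology) is to send the same question to the suspect server and to a resolver you trust, and compare the answers. The fiddly part is just constructing the DNS query packet, which the standard library can do by hand:

```python
import struct

def encode_qname(hostname: str) -> bytes:
    """DNS name wire format: length-prefixed labels, terminated by a zero byte."""
    out = b''
    for label in hostname.rstrip('.').split('.'):
        out += bytes([len(label)]) + label.encode('ascii')
    return out + b'\x00'

def build_query(hostname: str, txid: int = 0x1234) -> bytes:
    """A standard DNS query packet asking for an A record, IN class."""
    header = struct.pack('>HHHHHH',
                         txid,    # transaction ID, echoed in the reply
                         0x0100,  # flags: standard query, recursion desired
                         1,       # QDCOUNT: one question
                         0, 0, 0) # no answer/authority/additional records
    question = encode_qname(hostname) + struct.pack('>HH', 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question
```

Sending that packet over UDP port 53 to an open resolver and to, say, your ISP's resolver, then comparing the returned addresses for a well-known name, is a crude detector for exactly the resolution-path corruption the Georgia Tech and Google researchers describe. (Their paper does the measurement at scale and far more carefully.)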
Tuesday, February 12, 2008
State Farm Insurance
California Reconsideration of Fault Department
P.O. Box 22664
Bakersfield, CA 93390-2664
Re: Claim Number: [redacted]
Policy Number: [redacted]
Date of Accident: January 28, 2008
I respectfully request reconsideration of the claim department’s determination that I am principally responsible for this accident. Claim representative [redacted] stated that this determination was based on his personal belief that the accident would not have happened had I been driving safely; your letter echoes this by saying that it is “because of traveling at an unsafe distance.”
I grant that the stone that I struck was probably stationary at the moment of impact. But I did not cause the accident, nor could I have avoided it by traveling at a different distance.
Let me remind you of the circumstances of the accident. I was returning home from work by my usual route. Although it was after dark, driving conditions were good. At that point Alma Street has two lanes in each direction, plus a center turn lane. The speed limit is 35 mph. I was moving with traffic, neither passing nor being passed, neither crowding nor being crowded.
Without any warning, I spotted a light-colored object on the pavement behind the car in front of me. That car had neither braked nor swerved. I was unable to stop before striking the object and damaging the underside of my car. When I was able to pull over to the side of the road and back off it, I discovered that it was a large, flat-sided rock.
In our discussion, Mr. [redacted] advanced the ludicrous claim that every prudent driver will always maintain enough separation from the car in front to be able to stop before hitting a fixed object that emerges from under it--about 130 feet at 35 mph. This is advice that I have never encountered before. I do not know where Mr. [redacted] drives, but around Palo Alto, it is difficult for a cautious driver to maintain a separation of even one car-length per ten mph, because other traffic continually cuts in and fills the separating space.
Regardless of what Mr. [redacted] considers prudent, another five or ten car-lengths of separation would not have prevented this accident. The stone was already well clear of the car ahead when I first sighted it. The problem was its lack of visibility. Since the accident, whenever I pass that stone at night I try to judge how far it would have been visible, had I been expecting it. I do not think I could have seen it from more than 100 feet, given the similarity of its color to the light-colored pavement on that stretch, and the glare from oncoming traffic. The stone is still there by the curb; you can easily get an independent assessment of this if you wish.
I am firmly convinced that the person principally responsible for this accident is the one who caused the stone to be in the middle of the street in the first place. I don’t know who this was. I have hearsay evidence from a man who had been following the stone’s scrape marks up Alma, trying to find the car dragging it. He said that he was an apartment manager, and had seen a woman drive over boundary stones at his apartment complex, dragging one of them into the street under her car. The trail of scratches ended at the point of my accident. Perhaps she was driving the car in front of me, and the stone finally worked its way from under her car to be deposited in front of me. I don’t know. But I certainly wasn’t the one who put it there.
The unusual circumstances of this accident make it a poor predictor of my future accidents and claims. Suddenly spotting a 60-pound stone in the middle of their lane in traffic is something most drivers probably don’t experience in a lifetime.
If there is any further information that I can provide to assist you in this decision, please ask.
PS The brick in the picture was added purely for visual scale. It was not on the road and played no role in the accident.
Updated 3/9/08 to add: State Farm replied on March 6, 2008. Note the specificity, understanding, and sympathy of their response to each of my points above. I wonder if I would have received this much care and attention if I had not been a customer continuously for the last 45 years?
Dear James Horning:
I received your letter asking for a reconsideration of our claim staff's decision about the party principally at fault in this accident. In response to your request, I reviewed the at fault determination letter, the supporting documentation, and the information you submitted.
My review indicates the information supports our claim staff's decision.
Please contact your original claim handler should you have any questions regarding their decision or the damages claimed in this accident. California regulations explain that a driver is "principally at fault" if he or she caused 51% or more of the accident.
Thank you for bringing your concerns to my attention.
Team Manager, Reconsideration of Fault Coordinator
State Farm Mutual Automobile Insurance Company
By way of contrast, my repair shop manager tells me that the State Farm Claims Adjuster insisted on inspecting literally every nut and bolt listed on the repair estimate before approving it.
Monday, February 11, 2008
Some time around 313 B.C., the Romans built the first of eleven aqueducts--engineering marvels that would become critical to their capital and to the influence of the Roman Empire. This first aqueduct was built completely underground for what historians have concluded were three main advantages: first, to conceal and to protect the water supply from enemies; second, to provide an additional level of protection from erosion and pollution; and finally, to be less disruptive to life above ground.

The full paper is also available online.
Back then, as now, the perception of risk had a direct correlation to how systems were designed. Over time, a decreased sensitivity to security risk in ancient Rome resulted in design modifications that made the aqueducts more vulnerable to disruption. Roman engineers began to incorporate architectural “advances” into the aqueduct system, adding magnificent arcades with arches and other above-ground structures that advertised Roman greatness.
Unfortunately these structures also made the aqueducts vulnerable to exploitation, because the water supply was no longer protected underground.
Thus, the infrastructure changed from a hidden and purpose-built system into a visible symbol that invading forces found appealing. Eventually those vulnerabilities were exploited by invading German tribes, who damaged the aqueducts, disrupting water supplies...
As the flow of water dwindled, so did the hope of Rome’s ability to repel the foreign invaders. Ironically, the only aqueduct left in commission after these invasions was the Aqua Virgo, which had been built underground...
Given the current risks and vulnerabilities, we feel that the history of the Roman aqueducts--both as they were originally built and as they changed over the centuries--holds great lessons for the security community today.
Lesson 1: Infrastructures are critical to the security of a state and represent a common good...
Lesson 2: Incorporating new technology can introduce vulnerabilities...
Lesson 3: Infrastructures are built to last and are seldom replaced, even when they may need to be...
Lesson 4: Security in the design is directly tied to how designers perceive security risk...
Modern day engineers should learn from this example and draw from it the understanding that design decisions should anticipate changes over time to environmental and system factors, including security.
Perceptions often lag reality, and it can be costly to weigh your options or implement changes only after security threats become too great to ignore. Built-in security is cheaper and more effective than trying to retrofit it after the system has already been placed into operation. Once the last brick has been placed, infrastructure design decisions have been “cast in stone,” and like the aqueducts, are built to last and hence not easily changed or replaced...
I read your piece in the CRA blog, and I must take mild exception. (And not just because you made me feel old. I did not have the luxury of majoring in computer science--there were no departments then.)
When I look back on my undergraduate and M.S. education in physics, it devoted very little time to the outer layers of the onion. It was focused on the core: mechanics, optics, thermodynamics, electricity and magnetism, quantum mechanics, nuclear physics. I believe these are still the heart of the physics curriculum a half-century later. Specialization in the outer layers of the onion was by and large deferred to the latter part of graduate education. Of course, students were made aware of many applications of physics in modern life, but that was not the subject matter of any of our courses.
Most of the "drivers" of things enabled by physics do not need to have a deep or precise understanding of most of its core topics (although everyone should have some understanding of mechanics). Physics departments focused on the education of future physicists, and taught a few service courses to help non-physicists appreciate some of the fundamental principles. Would it be wrong for computer science departments to similarly focus on our core? I recognize that physics is not a major growth area for most universities, but it plays a time-tested role. (And, as the chairman of a major university physics department once reminded me, "The country's always going to need truck drivers.")
Computer appreciation, like music appreciation and driver education, is a worthwhile service course, but should it be the model for computing education?
Anyhow, I'm glad CRA is looking into this whole area, and I'm sure it couldn't have found a better person than Andy van Dam to lead the effort.
Wednesday, February 06, 2008
A US Department of Homeland Security (DHS) bug-fixing scheme has uncovered an average of one security glitch per 1,000 lines of code in 180 widely used open source software projects.

Eight thousand bugs in fifty million lines doesn't seem to quite justify the one-per-thousand claim, but it's still a lot of security holes found at a modest price.
The program, called the Open Source Hardening Project, is sponsored by the DHS and carried out by Coverity and Stanford University. Launched in March 2006, the US$300,000 project set out to review the code of 180 open source software projects frequently used by government website and application developers.
All the software scrutinized was found to have significant numbers of security flaws, Coverity said on Wednesday. Since 2006 the project has helped fix 7,826 open source flaws in 250 projects, out of 50 million lines of code scanned, the company said.
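The back-of-envelope arithmetic behind my skepticism, using the article's own figures:

```python
defects = 7_826                    # flaws fixed since 2006 (figure from the article)
loc = 50_000_000                   # lines of code scanned (figure from the article)
density = defects / loc * 1_000    # defects per thousand lines (KLOC)
print(f"{density:.2f} defects per KLOC")   # about 0.16, well under 1 per KLOC
```

Roughly one flaw per six thousand lines, not per thousand--unless the "one per 1,000" headline counts glitches found rather than fixed, which the article doesn't say.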
This does not, of course, directly shed any light on the oft-heard claim that open source software has fewer residual bugs than proprietary software.
Monday, February 04, 2008
Several other sources are carrying the news.
Updated 14:33 2/4/08 with ACM links.
See also "A seductively bad idea," and "This time, internet voting is being deployed."