Archive for the 'OTA report cited' Category

Congress Lacks Technical Knowledge

Luke Rosiak | The Cutting Edge | June 6, 2012

“High turnover and lack of experience in congressional offices are leaving staffs increasingly without policy and institutional knowledge, a Washington Times analysis of a decade of House and Senate personnel records shows — leaving a vacuum that usually is filled by lobbyists,” according to this blog post.

As policy questions more frequently hinge on the nuances of technical matters, members of Congress are operating without the researchers and topical experts on which they have relied to cast informed votes. With the shuttering of the Office of Technology Assessment, a 200-member congressional support agency that closed in 1995 under House Speaker Newt Gingrich, members who are largely lawyers and rhetorical masters are asked to differentiate between competing proposals that only scientists might be able to evaluate effectively.

The technology office researched and summarized scientific and technological matters, ranging from acid rain to wireless phones, for members who, with an average age of 64 in the Senate and 58 in the House, are legislating on matters such as the Internet, which most of them spent much of their lives without. Typical of its work products was a decades-ago warning on the effect of technology on copyright law, a question lawmakers contentiously grappled with this year. “It helped us to … better oversee the science and technology programs within the federal establishment,” said then-Rep. Amo Houghton, New York Republican, who served nine terms before retiring in 2005. The role of the Congressional Research Service (CRS), which provides research on topics beyond science and technology, has also been rolled back.

OTA published several reports about technology and copyright law, including: Copyright and Home Copying: Technology Challenges the Law, Finding A Balance: Computer Software, Intellectual Property and the Challenge of Technological Change, and Intellectual Property Rights in an Age of Electronics and Information.

Records show that many congressional staff leave for better-paying positions at lobbying firms, where they prepare policy papers to influence their former colleagues – but with the interests of their new employers in mind, according to the article.

“Staff are incredibly vulnerable to this,” according to Daniel Schuman, a former CRS lawyer who now studies policy at the nonpartisan Sunlight Foundation. “They’re trying to do a very complicated job with limited resources.”

Leschine Testifies on Oil Spill

Thomas Leschine | June 9, 2010

Prof. Leschine recently spoke about the disastrous oil spill in the Gulf of Mexico at a hearing of the Energy and Environment Subcommittee of the House Energy and Commerce Committee.

In his testimony, Leschine said that inadequate risk assessment and underfunding of technologies for prevention and response have added to the problem. Leschine directs the School of Marine Affairs in the College of the Environment at the University of Washington.

Massive amounts of dispersants have been injected into the oil plume with very little understanding of their effect on the environment, Leschine added.

In his testimony, Leschine pointed to an OTA report, saying:

In 1990, shortly after the Exxon Valdez spill, the U.S. Office of Technology Assessment prepared at the request of the Congress a Background Paper, Coping with an Oiled Sea: An Analysis of Oil Spill Response Technologies. The report, strongly influenced by events then still unfolding in Prince William Sound, warned that future spills could easily overwhelm the technologies we had. It also cautioned that we can’t prepare for every contingency. The risk will never be zero. It found that industry had focused its efforts on preparing for small, relatively easily controllable spills in harbors and sheltered areas, and that it had likely oversold its ability to respond to major spills. Major spills in open water had up to that point seen recovery rates of no more than 10% of oil spilled, 6-8% in the case of Exxon Valdez, despite billions spent on response. I believe that this picture has not changed much today.

The OTA report found that the relative rarity of major spills was a major impediment to a sustained effort that would yield a higher-impact technology development program. The good news, perhaps, is that it also found the problem to be less a matter of needing dramatic engineering breakthroughs and more one requiring simply good engineering and sustained attention. It highlighted the need for good design and maintenance, training in deployment and use, and pre-positioning of response equipment in adequate quantities and types to deal with the really big events, like now. The report focused on technology to be sure, but also on decision-making, logistics, and training. Soft technologies, in other words.

In my view, OTA’s findings remain largely valid today, twenty years later. In many ways we are better prepared, but progress has been in fits and starts, the issue-attention cycle at work in my view. A robust approach to filling the tool kit, with the right hard and soft technologies, is needed.

Coping with Large Oil Spills

Fabius Maximus | May 17, 2010

This blog post, About the long term effect of giant oil spills, says that past large oil spills have had few long-term effects. It provides a bit of history about oil spills, saying, “Hundreds of tankers and oilers were sunk during WWII — 333 identified in the Pacific. Many burned or spilled their oil when sunk. Many remain on the seabed still loaded with crude oil or oil products.”

Also discussed is IXTOC I, a well blowout that occurred in 1979, which spilled between 139 million and 428 million gallons of oil into the Gulf of Mexico. The blog provides links to several documents about IXTOC I, including a 1990 OTA background paper, Coping with an Oiled Sea, which lists it as the largest oil spill since 1967.

OTA had been asked to study the issue in response to the 1989 Exxon Valdez spill in Prince William Sound, Alaska. In the foreword of the 1990 paper, OTA Director John H. Gibbons says:

Cleaning up a discharge of millions of gallons of oil at sea under even moderate environmental conditions is an extraordinary problem. Current national capabilities to respond effectively to such an accident are marginal at best. OTA’s analysis shows that improvements could be made, and that those offering the greatest benefits would not require technological breakthroughs – just good engineering design and testing, skilled maintenance and training, timely access to and availability of the most appropriate and substantial systems, and the means to make rapid, informed decisions. One must understand, however, that even the best national response system will have inherent practical limitations that will hinder spill response efforts for catastrophic events – sometimes to a major extent. For that reason it is important to pay at least equal attention to preventive measures as to response systems. In this area, the proverbial ounce of prevention is worth many, many pounds of cure.

How Scientific is Modern Medicine?

Dana Ullman | Huffington Post | April 20, 2010

Scientific justification for medical treatments is an ideal, or perhaps a marketing tool, not a reality, according to this blog:

Doctors like to point to the “impressive” efficacy of their treatments in real serious diseases, like cancer, and doctors (and drug companies) are emphatic about asserting that anyone or any company that says (or even suggests) that they have a treatment that might help people with cancer are “quacks.” However, do they maintain this same standard when evaluating their own treatments?

The British Medical Journal and a report by OTA found little evidence to support common medical treatments, according to the blog.

The OTA report referred to was “Assessing the Efficacy and Safety of Medical Technologies” (1978). One statement from that report has been quoted in many publications: “It has been estimated that only 10 to 20 percent of all procedures currently used in medical practice have been shown to be efficacious by controlled trial”. However, the last few words of that quote are often omitted.

In the report, OTA points out that modern methods complement the older techniques of evaluating medical technologies:

Traditionally, clinical experience, based on informal estimation techniques, has been the most important. Other techniques, such as epidemiological studies, formal consensus development, and randomized controlled clinical trials, however, are being used increasingly. The last technique, especially, has gained prominence (in the past 20 years) as a tool for assessing efficacy and safety.

OTA wasn’t asking that treatments by “quacks” be held to the same low standard as more traditional doctoring. Its emphasis was on getting better data overall. In the report, OTA says:

Given the shortcomings in current assessment systems, the examples of technologies that entered widespread use and were shown later to be inefficacious or unsafe, and the large numbers of inadequately assessed current and emerging technologies, improvements are critically needed in the information base regarding safety and efficacy and the processes for its generation.

Punditry Contestant Recommends OTA

Marisa Katz | Washington Post | October 30, 2009

The Washington Post is sponsoring “America’s Next Great Pundit Contest.” The Post received 4,800 entries from people who hoped to write better commentary than they had been reading. It selected ten entries to move to the next level of the competition; the winner will be hired to write a weekly column.

Among the ten finalists was the Nobel Prize-winning physicist Burton Richter, who opined about the need for Congress to re-establish the Office of Technology Assessment. He pointed out that a 1974 OTA report, “Drug Bioequivalence,” is relevant to recent discussions of health care costs. He also recommended one of his favorite OTA reports, “Renewing Our Energy Future,” which discussed the potential of secondary sources for biofuels.

According to Richter, “A new OTA will not settle all the arguments because there are political dimensions to major technical issues, but at least it can help Congress arrive at a common starting point for complicated legislation.”

Kevin Drum of Mother Jones kindly provided a summary of the columns at “Pundit Watch” for those wishing to save a little time.

Seismologists monitor North Korea’s nuclear blasts

Dan Vergano | USA Today | May 29, 2009

A column about measuring the size of underground nuclear blasts by their seismic waves refers to a 1988 OTA report, “Seismic Verification of Nuclear Testing Treaties,” which suggested it might be difficult to detect a nuclear test smaller than 5 kilotons.

In the two decades since that report, verification has improved and now smaller blasts can be detected, the article says.
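For readers curious how such yield estimates work in practice: analysts typically relate body-wave magnitude (mb) to explosive yield with a relation of the form mb = a + b·log10(yield). The sketch below is a minimal illustration of that relation; the constants a and b are rough, site-dependent assumptions (values of this order are sometimes cited for well-coupled, hard-rock test sites), not figures taken from the OTA report or the USA Today column.

```python
import math

# Assumed calibration constants for mb = A + B * log10(yield_kt).
# These are illustrative only; real calibrations depend on the test
# site's geology and on how well the explosion couples to the rock.
A = 4.45
B = 0.75

def magnitude_from_yield(yield_kt: float) -> float:
    """Estimate body-wave magnitude (mb) from yield in kilotons."""
    return A + B * math.log10(yield_kt)

def yield_from_magnitude(mb: float) -> float:
    """Invert the relation: estimate yield in kilotons from mb."""
    return 10 ** ((mb - A) / B)

if __name__ == "__main__":
    # With these assumed constants, a 5-kiloton test (roughly the
    # detection threshold discussed in the 1988 OTA report) comes out
    # near mb 5.0, while an mb 4.5 event maps to about 1.2 kilotons.
    print(f"5 kt   -> mb {magnitude_from_yield(5.0):.1f}")
    print(f"mb 4.5 -> {yield_from_magnitude(4.5):.1f} kt")
```

The weak point of any such estimate is the calibration: a small test in porous rock, or one deliberately decoupled in a cavity, produces a much smaller magnitude than the same yield well coupled, which is one reason detection thresholds were a live question in the 1988 report.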