9.23.2009

david walker vids

http://video.google.com/videosearch?rlz=1C1GGLS_enUS291US304&sourceid=chrome&q=David+walker&um=1&ie=UTF-8&ei=0Dm6SoCqBY7w8Qa58umMCg&sa=X&oi=video_result_group&ct=title&resnum=4#

ascent of money

The Ascent of Money Episode 4: Planet Finance

In the final episode of the four-part THE ASCENT OF MONEY series, Niall Ferguson chronicles the spread of good -- and bad -- financial practices across the globe, the meteoric rise of the American real estate market, and the consequences of the subprime mortgage fiasco.

Posted: Jul 27th, 2009  Comments: 30   Views: 16,444   (46 votes)
The Ascent of Money Episode 3: Risky Business

In "Risky Business," part three of the four-part THE ASCENT OF MONEY, economist and historian Niall Ferguson examines the roots of the insurance industry, natural disasters and risk management, and the history of hedge funds.

Posted: Jul 17th, 2009  Comments: 46   Views: 18,551   (63 votes)
The Ascent of Money Episode 2: Bonds of War

In the second episode of the four-hour series THE ASCENT OF MONEY, economist and historian Niall Ferguson explores John Law and his Louisiana territory Ponzi scheme; bond markets that supported warfare in Europe; and the acceleration of globalization with the economic invasion of the Far East.

Posted: Jul 8th, 2009  Comments: 12   Views: 25,628   (68 votes)
The Ascent of Money Episode 1: From Bullion to Bubbles

In the first episode of the four-hour series THE ASCENT OF MONEY, economist and historian Niall Ferguson documents the roots of money in the conquest of the Americas, from the Incan empire to the Louisiana territory.

Posted: Jul 8th, 2009  Comments: 42   Views: 57,623   (117 votes)
Watch the two-hour THE ASCENT OF MONEY

Watch the two-hour program THE ASCENT OF MONEY online now.

Posted: Jul 8th, 2009  Comments: 341   Views: 186,764   

9.09.2009

patentnerd

Welcome to patentnerd

8.12.2009

MPEP

MPEP, Eighth Edition, Revision 7 (E8r7). Each chapter and appendix is available in PDF and HTML.

Blue Pages
Title Page
Foreword
Table of Contents
Introduction
100 - Secrecy, Access, National Security, and Foreign Filing
200 - Types, Cross-Noting, and Status of Application
300 - Ownership and Assignment
400 - Representative of Inventor or Owner
500 - Receipt and Handling of Mail and Papers
600 - Parts, Form, and Content of Application
700 - Examination of Applications
800 - Restriction in Applications Filed Under 35 U.S.C. 111; Double Patenting
900 - Prior Art, Classification, and Search
1000 - Matters Decided by Various U.S. Patent and Trademark Office Officials
1100 - Statutory Invention Registration (SIR) and Pre-Grant Publication (PG Pub)
1200 - Appeal
1300 - Allowance and Issue
1400 - Correction of Patents
1500 - Design Patents
1600 - Plant Patents
1700 - Miscellaneous
1800 - Patent Cooperation Treaty
1900 - Protest
2000 - Duty of Disclosure
2100 - Patentability
2200 - Citation of Prior Art and Ex Parte Reexamination of Patents
2300 - Interference Proceedings
2400 - Biotechnology
2500 - Maintenance Fees
2600 - Optional Inter Partes Reexamination
2700 - Patent Terms and Extensions
Appendix I - Partial List of Trademarks
Appendix II - List of Decisions Cited
Appendix L - Patent Laws
Appendix R - Patent Rules
Appendix T - Patent Cooperation Treaty
Appendix AI - Administrative Instructions Under the PCT
Appendix P - Paris Convention
Index - Subject Matter Index

8.07.2009

A blueprint for biotech's blues : Article : Nature Biotechnology

Editorial


Nature Biotechnology 27, 675 (2009)
doi:10.1038/nbt0809-675

A blueprint for biotech's blues


Abstract

The strategy outlined in the UK's Life Sciences Blueprint is unlikely to address the British biotech sector's woes or help it regain prominence and success.


Introduction

The UK Life Sciences Blueprint announced in July is the result of a six-month consultation between industry and the British government's Office for Life Sciences. The Office is led by the Minister for Science and Innovation, Lord Paul Drayson, founder and former CEO of one of the UK's biotech success stories, the drug delivery company PowderJect. Drayson has drawn together in weekly meetings representatives of all the life sciences sectors—biotech, diagnostics, pharmaceuticals and devices—and those parts of government responsible for research, industry and health. But despite the undeniable credentials of those involved, the Blueprint's 50 pages read less like a plan and more like a credo, or perhaps a chant at a séance designed to summon the spirits of past achievements to the service of the desperate present.

Several 'big ideas' emerge from the Blueprint, but none is convincing. The most tangible, perhaps, falls under the banner of "Access to finance and stimulating investment." Acknowledging that early biotech ventures face increasing difficulties in securing venture capital, the UK government's response has been to create the Innovation Investment Fund (IIF). This will provide £150 ($246) million of taxpayers' money as a means of leveraging private capital to build a £1 ($1.6) billion, 10-year fund.
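The fund's scale can be sanity-checked with some quick arithmetic. In the sketch below, the dollar-per-pound rate is simply the one implied by the article's own £150M/$246M conversion, not an official figure:

```python
# Quick arithmetic check of the Innovation Investment Fund (IIF) figures
# quoted above. The exchange rate is inferred from the article's own
# conversions, not from an authoritative source.
gbp_public = 150e6   # UK government contribution, pounds
gbp_target = 1e9     # target total fund size, pounds
years = 10

leverage = gbp_target / gbp_public    # total fund size per public pound
usd_per_gbp = 246e6 / gbp_public      # rate implied by the $246M figure
annual_gbp = gbp_target / years       # average deployment per year

print(f"leverage: {leverage:.1f}x")             # ~6.7x
print(f"implied rate: {usd_per_gbp:.2f} $/GBP") # 1.64
print(f"annual deployment: {annual_gbp/1e6:.0f}M GBP")  # 100M GBP per year
```

A run rate of roughly £100 million per year, split across life sciences, clean tech, advanced manufacturing and IT, underlines the editorial's point that the intervention is modest.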

Unfortunately, the IIF will have no, or at best a negligible, effect on the life sciences financing environment for several reasons. First, £1 billion over 10 years is not much money. Second, the money will go not only to life sciences companies but also to clean tech, advanced manufacturing and IT. And third, the intervention may end up putting soft money into companies that probably would not be funded otherwise. That, of course, is the point of the fund.

But the point for investors is that de-risking should be conducted at the level of the business, not at the level of the incoming cash. By encouraging investments in early-stage ventures, the UK government is not correcting a market failure; it is simply ignoring the market reality that too many UK companies are started that subsequently fail. Later-stage finance, market-access hurdles, the ability to collaborate and expand internationally: these are the areas where intervention might be needed, not in the translation of research into companies. The IIF is likely to create more of the kind of weak companies that are already part of the problem.

Another big idea in the Blueprint is the 'recognition' of the National Health Service (NHS), Britain's overarching centralized healthcare provider, as a champion of innovation. Many in biotech will be amused by the juxtaposition of the terms 'innovation' and 'NHS', particularly when the UK's National Institute for Health and Clinical Excellence (NICE) is held up as the engine for innovation. NICE is the organization that determines reimbursement for drugs and other treatments in the UK. In effect, it has been responsible for delaying the introduction of several biotech compounds, including Nexavar for liver cancer, Velcade for multiple myeloma and Avastin for colorectal cancer. NICE puts the 'no' in innovation. It has reduced the UK drug bill but at the expense of technical innovation in medicine development.

The solution, according to the Blueprint, is for NICE to introduce an 'Innovation Pass', a way in which selected medicines can bypass the obstructive NICE appraisal. There is a £25 ($41) million budget for a pilot project, and NICE will be involved in developing the eligibility criteria for the Pass. So the big idea in NHS innovation boils down to a temporary measure to undo the harm, a measure overseen by the agency that is responsible for the harm in the first place. The NHS could, indeed, be a powerful resource if used properly: its nation-wide banks of patient samples and detailed health records are world-class assets that could be mobilized now for disease and adverse drug reaction stratification. At present, the primary obstacle is the hidebound attitude of physicians' organizations that view their members as guardians of patient data. A little bit of leadership from government might shift that balance. This is a critical challenge facing the drug industry today; the UK could lead the world if it seized the opportunity.

The coup de grâce in the Blueprint—its other big idea—is that one of the key transforming actions necessary for UK biotech is to market the UK biotech brand. An industry-led UK Life Science Marketing Strategy Board will accelerate marketing activity in the coming months with activities such as attending trade shows and holding road shows to help the UK "speak with one voice" on the life sciences, tell venture funds about the IIF and create a "UK supercluster."

Creating this supercluster is not a question of forming and stimulating companies; it is a matter of corralling existing companies within a 'Supercluster' brand, a need that arises because the UK is no longer united in industrial policy. Spurred by the European Union tenet of devolution, central government has passed down responsibilities for industrial development to regional governments. Consequently, Scotland, Wales, North East of England, the Midlands, East of England and six other regions all believe they can and do have their own life sciences cluster. And this regional chatter drowns out any overall message about the UK's strengths.

The reality of government intervention in the UK, and perhaps elsewhere in Europe, is that the measures that government can afford to take (and is allowed to take under European competition law) are not going to be effective. The Blueprint is, unfortunately, not only a litany of ineffectiveness; it is a feeble attempt to reverse government-originated harm to the biotech sector by temporarily disabling the measures that hurt the sector in the first place. By piling on fresh 'initiatives', the UK government complicates the bureaucracy that companies and innovators face. Of course, removing the obstructions altogether would be a better solution, but that would involve admitting government culpability.

7.15.2009

RAND | Hot Topics | Health Care Reform

RAND Resources for Health Care Reform

Driven by concerns about the escalating cost of health care and large numbers of uninsured Americans, Congress is considering a series of sweeping changes to health care policy. These changes have the potential to transform health and health care in the United States in several ways, including expanding insurance programs to cover millions of the currently uninsured; changing how care is paid for and how costs are shared among insurers, patients, and government sources; improving quality of care through a range of methods, including measurement tools, financial incentives for providers, and information technologies; and promoting healthier lifestyle and behavioral choices.

For the past 40 years, RAND Health, one of the world's largest private health research groups, has conducted research and analysis on topics that are currently at the center of the health care reform debate. The key elements of this work are described below and include COMPARE (Comprehensive Assessment of Reform Efforts). This resource was created to help provide policymakers and interested parties with a unique way of understanding and evaluating the effects and unintended consequences of various health care reform proposals.

'COMPARE' Provides Global Positioning System for Health Care Policy

COMPARE is a first-of-its-kind online resource that provides one-stop shopping for objective analysis of health policy issues. COMPARE presents: facts and figures about the current state of the U.S. health care system, focusing on key dimensions of system performance; a description of policy options for changing the health care system; an inventory and the status of the most prominent federal, state, and private health care reform proposals; and an interactive tool that presents the results of microsimulation analyses of the effects of different health care policy options on multiple dimensions of health system performance, including cost, coverage, and outcomes.

Visit COMPARE online

RAND Research: Informing the Health Care Debate

Increasing Access to Health Care

Access to health care usually refers to the ease with which an individual can obtain needed medical services. In addition to policy options that would increase the availability of insurance coverage—which is linked to improved access to needed care—Congress is considering options for addressing socioeconomic and urban/rural disparities in access, including tax incentives and policy options to address primary care workforce issues. RAND research has examined factors that influence access, the effects of changes in access, and the relationship between access and health for specific populations, including racial and ethnic minorities, people with limited English proficiency, immigrants, children, and veterans.

Increasing Health Insurance Coverage

In 2007, about 45.7 million people in the U.S. were uninsured. The uninsured come from every income level, age group, employment status, gender, race, ethnicity, and region of the country. Health care coverage protects individuals against the financial risk that might result from unpredictable and expensive health care needs. Individuals who have health care coverage tend to receive more preventive care, are less likely to avoid or delay needed medical care because of cost, and may have better health outcomes than patients without coverage. Increasing the proportion of people with adequate protection from financial risk due to health care expenses is a cornerstone of many health care reform proposals. The most widely discussed options for expanding coverage to the uninsured include employer mandates, individual mandates, refundable tax credits, and expanding Medicaid/SCHIP eligibility. RAND analysts have used microsimulation modeling methods to predict the effects of specific policy scenarios on health care coverage and health spending.

Decreasing Costs and Increasing Quality

Despite investing $1.7 trillion annually in health care, the U.S. health care system is plagued with inefficiency and poor quality. The need to control costs while also increasing quality of care figures prominently in the health care reform debate. Better information systems could help, as could efforts to pay for quality and outcomes rather than for the number of services delivered. RAND analysts have conducted extensive work on quality of care, estimated the costs and benefits of widespread adoption of health IT, assessed the effects of pay-for-performance programs, and conducted studies on public reporting of performance information and its effect on performance and patient experience.

Decreasing Health Spending

U.S. health care spending continues to rise rapidly and accounts for an increasing proportion of the gross domestic product. Significant contributors include price inflation, the number and mix of services used, population growth, the aging of the population, the obesity "epidemic," and new technology. Most health care bills are paid by "third party payers" (such as insurance companies, employers, and government programs), so patients have little incentive to be discerning consumers who demand high-quality services at lower cost and use less unnecessary care. Increased consumerism is also expected to encourage cost and quality competition among health care providers, resulting in lower prices for services. Drawing on publicly available data, RAND analysts have examined trends in health care spending and have also assessed how policy options such as high-deductible health plans and increased cost sharing affect both costs and health outcomes.

Promoting Wellness and Prevention

There is widespread consensus that healthy lifestyle choices including diet, exercise, and preventive screenings are vital to a healthy population. While there is also agreement about the negative impact of poor lifestyle choices, and that changes in these habits—either through organized programs or the efforts of individuals—could improve health, there is no consensus about how these changes would affect health spending. Governments, employers and health insurers have considered the use of incentives in public and private health insurance programs to promote behavior change with the expectation of improving health and saving money. RAND investigators have studied the effects of obesity and disease management programs on health and health spending.

Health Care Organization and Capacity

RAND work in this area has taken a system perspective, examining how organization affects the health system's ability to provide high-quality care and use resources efficiently and effectively. RAND has conducted hundreds of analyses of how changes in health care markets, delivery systems, and financing mechanisms affect patients, providers, insurers, and medical-product manufacturers. For example, this work has extended into such areas as state health care financing initiatives for the uninsured, the impact of prescription drug benefits on health outcomes and costs, and the effect of managed care on utilization and quality of care.

RAND Research on Health Care Organization and Capacity (PDF)

5.12.2009

earth humans and climate

Earth - Wikipedia, the free encyclopedia

Home to millions of species,[11] including humans, Earth is the only place in the universe where life is known to exist. The planet formed 4.54 billion years ago,[12][13][14][15] and life appeared on its surface within a billion years.


Human - Wikipedia, the free encyclopedia

A human being, also human or man, is a member of a species of bipedal primates in the family Hominidae (taxonomically Homo sapiens, Latin: "wise human" or "knowing human").[2][3] DNA evidence indicates that modern humans originated in east Africa about 200,000 years ago.

Industrial Revolution - Wikipedia, the free encyclopedia

The onset of the Industrial Revolution marked a major turning point in human society; almost every aspect of daily life was eventually influenced in some way. Starting in the latter part of the 18th century, there began a transition in parts of Great Britain's previously manual-labour and draught-animal-based economy towards machine-based manufacturing.


Hockey stick controversy - Wikipedia, the free encyclopedia

The Hockey stick graph as shown in the 2001 IPCC report. This chart shows the data from Mann et al. 1999. The colored lines are the reconstructed temperatures, and the gray shaded region represents estimated error bars.
Reconstructions of Northern Hemisphere temperatures for the last 1,000 years according to various older articles (bluish lines), newer articles (reddish lines), and instrumental record (black line).

The hockey stick controversy is a dispute over the reconstructed estimates of Northern Hemisphere mean temperature changes over the past millennium,[1] especially the particular reconstruction of Michael E. Mann, Raymond S. Bradley and Malcolm K. Hughes,[2] frequently referred to as the MBH98 reconstruction. The term "hockey stick" was coined by the head of National Oceanic and Atmospheric Administration's (NOAA) Geophysical Fluid Dynamics Laboratory, Jerry Mahlman, to describe the pattern.

Nature of the dispute

A quasi-global instrumental temperature record exists from approximately 1850, but to construct a millennial-scale record, proxies for temperature are required. Issues arise over the faithfulness with which these proxies reflect actual temperature change, their geographical coverage, and the statistical methods used to combine them.

The political significance of the scientific controversy over the graph centers on its use as part of the evidence for anthropogenic global warming. The MBH98 reconstruction was prominently featured in the 2001 United Nations Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report (TAR) and as a result has been widely published in the media.

This dispute centered on technical aspects of the methodology and data sets used in creating the MBH98 reconstruction. The issue was originally raised by former mining executive Stephen McIntyre and economist Ross McKitrick. Their criticisms were that Mann et al.'s reconstructed millennial temperature graph (the hockey stick) was an artifact of flawed calculations and serious data defects; in turn, MBH replied that these criticisms were spurious.

The dispute eventually led to an investigation at the behest of U.S. Congress by a panel of scientists convened by the National Research Council (NRC) of the United States National Academy of Sciences to consider reconstructions of the last 2000 years in general; in addition, an investigation was performed at the behest of Congressman Joe Barton by a panel of three statisticians, chaired by Edward Wegman specifically addressing the MBH work. Both the NRC and Wegman teams issued reports in 2006.

The second graph on the right shows the data from MBH98 and from several other climate reconstructions, subsequent to the 1998 reconstruction. Two of the other temperature reconstructions included on the graph are by Mann and co-authors.

There is an ongoing debate about the details of the temperature record and the means of its reconstruction. The debate centers on several discussion points:

  • How well can past temperatures be reconstructed from the data we have?
  • Was the late 20th century the warmest period during the last 1,000 years?
  • Was the Medieval Warm Period observed in the North Atlantic region part of a broader global or hemispheric warming?
  • Are bristlecone and foxtail pine tree rings valid temperature proxies?
  • Without using the bristlecone and foxtail proxies in the reconstruction, does a hockey stick even exist?

Discussion of the MBH reconstruction

The hockey stick controversy has to a large extent been focussed on Mann and on the MBH98 reconstruction on which he was the lead author. Scientific American magazine described him as the "Man behind the Hockey Stick," referring to this reconstruction of temperatures. The BBC described the "hockey stick" as a term coined for the chart of temperature variation over the last 1,000 years.[3] The chart is relatively flat from the period A.D. 1000 to 1900, indicating that temperatures were relatively stable for this period of time. The flat part forms the stick's "shaft." After 1900, however, temperatures appear to shoot up, forming the hockey stick's "blade." The combination of the two in the chart suggests a recent sharp rise in temperature caused by human activities. The BBC further stated that "The high-profile publication of the data led to the "hockey stick" being used as a key piece of supporting evidence in the Third Assessment Report by the United Nations' Intergovernmental Panel on Climate Change (IPCC) in 2001."[3]

In 2003, Stephen McIntyre and Ross McKitrick published "Corrections to the Mann et al (1998) Proxy Data Base and Northern Hemisphere Average Temperature Series" in the (JCR-unlisted) journal Energy and Environment 14(6) 751-772, raising concerns about their ability to reproduce the results of MBH. The IPCC AR4 reports that "Wahl and Ammann (2007) showed that this was a consequence of differences in the way McIntyre and McKitrick (2003) had implemented the method of Mann et al. (1998) and that the original reconstruction could be closely duplicated using the original proxy data." [4]. In 2004 Mann, Bradley, and Hughes published a corrigendum to their 1998 article, correcting a number of mistakes in the online supplementary information that accompanied their article but leaving the actual results unchanged.

Hans von Storch and colleagues claimed that the method used by Mann et al. probably underestimates the temperature fluctuations in the past by a factor of two or more.[5] However, this conclusion rests at least in part on the reasonableness of the global climate model (GCM) simulation used, which has been questioned;[6][7] Wahl et al. assert errors in the reconstruction technique that von Storch used.[8] Von Storch's claim implied that MBH98 was less accurate: if there was more variability than originally shown, then Mann's "hockey stick" would look less like a hockey stick and would therefore be a weaker argument for recent dramatic climate change.

The IPCC AR4 reports that the extent of any such biases in specific reconstructions is uncertain, but that it is very unlikely any bias would be as large as the factor of two suggested.

Anders Moberg and his Swedish and Russian collaborators have also generated reconstructions with significantly more variability than the reconstructions of Mann et al.[9][10]

After testing the work of Mann et al. (1998), McKitrick commented

"The Mann multiproxy data, when correctly handled, shows the 20th century climate to be unexceptional compared to earlier centuries. This result is fully in line with the borehole evidence. (As an aside, it also turns out to be in line with other studies that are sometimes trotted out in support of the hockey stick, but which, on close inspection, actually imply a MWP as well.)"[11]

In turn, Mann (supported by Tim Osborn, Keith Briffa and Phil Jones of the Climatic Research Unit) has disputed the claims made by McIntyre and McKitrick,[12][13] saying the

"...so-called 'correction' was nothing more than a botched application of the MBH98 procedure, where the authors (MM) removed 80% of the proxy data actually used by MBH98 during the 15th century period... Indeed, the bizarre resulting claim by MM of anomalous 15th century warmth (which falls within the heart of the "Little Ice Age") is at odds with not only the MBH98 reconstruction, but, in fact the roughly dozen other estimates now published that agree with MBH98 within estimated uncertainties...".[14]

On February 12, 2005, Stephen McIntyre and Ross McKitrick published a paper in Geophysical Research Letters that claimed various errors in the methodology of Mann et al. (1998). The paper claimed that the "Hockey Stick" shape was the result of an invalid principal component method.[15] They claimed that using the same steps as Mann et al., they were able to obtain a hockey stick shape as the first principal component in 99 percent of cases even if trendless red noise was used as input.[16] This paper was nominated as a journal highlight by the American Geophysical Union,[17] which publishes GRL, and attracted international attention for its claims to expose flaws in the reconstructions of past climate.[18] The IPCC AR4 says this paper may have some theoretical foundation, but that Wahl and Ammann (2006) also show that the impact on the amplitude of the final reconstruction is very small.
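The centering effect at issue can be illustrated in a few lines. The sketch below is a simplified demonstration of the general phenomenon, not a reproduction of either side's actual code: the series length, AR(1) persistence, number of proxies, and the simple "hockey stick index" used here are all assumptions.

```python
import numpy as np

def ar1(rng, n, rho=0.9):
    """Trendless red noise: an AR(1) process with persistence rho."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def pc1(X):
    """Score series of the first principal component of the columns of X
    (no internal centering: the caller chooses how to center)."""
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]

def hockey_stick_index(s, calib):
    """How strongly a series bends in the calibration window:
    |calibration-window mean - full mean| in units of the full std."""
    return abs(s[-calib:].mean() - s.mean()) / s.std()

rng = np.random.default_rng(0)
n_series, n_years, calib = 50, 600, 100   # assumed sizes, not MBH98's
hsi_full, hsi_short = [], []
for _ in range(20):
    X = np.column_stack([ar1(rng, n_years) for _ in range(n_series)])
    # Conventional PCA: center each series on its full-period mean.
    hsi_full.append(hockey_stick_index(pc1(X - X.mean(axis=0)), calib))
    # "Short" centering: subtract only the calibration-period mean,
    # the step McIntyre and McKitrick criticized in MBH98.
    hsi_short.append(hockey_stick_index(pc1(X - X[-calib:].mean(axis=0)), calib))

print(f"mean HSI, full centering:  {np.mean(hsi_full):.2f}")
print(f"mean HSI, short centering: {np.mean(hsi_short):.2f}")
```

Under short centering, series whose calibration-window mean happens to drift away from their long-run mean dominate the first component, so even pure noise tends to produce a pronounced bend at the calibration end; full-period centering does not share this bias.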

Mann has been personally involved in the debate over climate change. In testimony before the U.S. Senate in 2003, he stated:

"It is the consensus of the climate research community that the anomalous warmth of the late 20th century cannot be explained by natural factors, but instead indicates significant anthropogenic, that is human influences... More than a dozen independent research groups have now reconstructed the average temperature of the northern hemisphere in past centuries... The proxy reconstructions, taking into account these uncertainties, indicate that the warming of the northern hemisphere during the late 20th century... is unprecedented over at least the past millennium and it now appears based on peer-reviewed research, probably the past two millennia."

More recently, the National Academy of Sciences considered the matter. On June 22, 2006, the Academy released a pre-publication version of its report Surface Temperature Reconstructions for the Last 2,000 Years,[27] supporting Mann's more general assertion regarding the last decades of the twentieth century, but showing less confidence in his assertions regarding individual decades or years, due to the greater uncertainty at that level of precision.

"The basic conclusion of Mann et al. (1998, 1999) was that the late 20th century warmth in the Northern Hemisphere was unprecedented during at least the last 1,000 years. This conclusion has subsequently been supported by an array of evidence that includes ...
    Based on the analyses presented in the original papers by Mann et al. and this newer supporting evidence, the committee finds it plausible that the Northern Hemisphere was warmer during the last few decades of the 20th century than during any comparable period over the preceding millennium. The substantial uncertainties currently present in the quantitative assessment of large-scale surface temperature changes prior to about A.D. 1600 lower our confidence in this conclusion compared to the high level of confidence we place in the Little Ice Age cooling and 20th century warming. Even less confidence can be placed in the original conclusions by Mann et al. (1999) that "the 1990s are likely the warmest decade, and 1998 the warmest year, in at least a millennium" because the uncertainties inherent in temperature reconstructions for individual years and decades are larger than those for longer time periods, and because not all of the available proxies record temperature information on such short timescales." [28]

One point of contention relates to McIntyre's requests for Mann to provide him with the data, methods and source code McIntyre needed to "audit" MBH98.[19] Mann provided some data and then stopped. After a long process - in which the National Science Foundation supported Mann - the code was made publicly available.[20] The release came about because Congress investigated after an article in the Wall Street Journal [21] detailed criticisms raised by McIntyre.[22] Congress was especially concerned about Mann's reported refusal to provide data. In June 2005, Congress asked Mann to testify before a special subcommittee. The chairman of the committee (Joe Barton, a prominent global warming skeptic) wrote a letter to Mann requesting he provide his data, including his source code, archives of all data for all of Mann's scientific publications, identities of his present and past scientific collaborators, and details of all funding for any of Mann's ongoing or prior research, including all of the supporting forms and agreements.[21] The American Association for the Advancement of Science viewed this as "a search for some basis on which to discredit these particular scientists and findings, rather than a search for understanding."[23] Once Mann complied, all of the data was available to McIntyre. Congress also requested that third-party science panels review the criticisms by McIntyre and McKitrick. The Wegman Panel[24] and the National Academy of Sciences[25] both published reports. McIntyre and McKitrick (2005) claim that 7 of their 10 findings in 2003 have been largely confirmed by these reviews.[26] Nature reported it as "Academy affirms hockey-stick graph - But it criticizes the way the controversial climate result was used."[27]

[edit] National Research Council Report

At the request of the U.S. Congress, a special "Committee on Surface Temperature Reconstructions for the Past 2,000 Years" was assembled by the National Research Council's Board on Atmospheric Sciences and Climate. The Committee consisted of 12 scientists from different disciplines and was tasked with explaining the current scientific information on the temperature record for the past two millennia, and identifying the main areas of uncertainty, the principal methodologies used, any problems with these approaches, and how central the debate is to the state of scientific knowledge on global climate change.

The panel published its report in 2006.[28] The report agreed that there were statistical shortcomings in the MBH analysis, but concluded that they were small in effect. The report summarizes its main findings as follows:[29]

  • The instrumentally measured warming of about 0.6 °C (1.1 °F) during the 20th century is also reflected in borehole temperature measurements, the retreat of glaciers, and other observational evidence, and can be simulated with climate models.
  • Large-scale surface temperature reconstructions yield a generally consistent picture of temperature trends during the preceding millennium, including relatively warm conditions centered around A.D. 1000 (identified by some as the "Medieval Warm Period") and a relatively cold period (or "Little Ice Age") centered around 1700. The existence and extent of a Little Ice Age from roughly 1500 to 1850 is supported by a wide variety of evidence including ice cores, tree rings, borehole temperatures, glacier length records, and historical documents. Evidence for regional warmth during medieval times can be found in a diverse but more limited set of records including ice cores, tree rings, marine sediments, and historical sources from Europe and Asia, but the exact timing and duration of warm periods may have varied from region to region, and the magnitude and geographic extent of the warmth are uncertain.
  • It can be said with a high level of confidence that global mean surface temperature was higher during the last few decades of the 20th century than during any comparable period during the preceding four centuries. This statement is justified by the consistency of the evidence from a wide variety of geographically diverse proxies.
  • Less confidence can be placed in large-scale surface temperature reconstructions for the period from A.D. 900 to 1600. Presently available proxy evidence indicates that temperatures at many, but not all, individual locations were higher during the past 25 years than during any period of comparable length since A.D. 900. The uncertainties associated with reconstructing hemispheric mean or global mean temperatures from these data increase substantially backward in time through this period and are not yet fully quantified.
  • Very little confidence can be assigned to statements concerning the hemispheric mean or global mean surface temperature prior to about A.D. 900 because of sparse data coverage and because the uncertainties associated with proxy data and the methods used to analyze and combine them are larger than during more recent time periods.

In response, a group-authored post on RealClimate, to which Mann is a contributor, stated, "the panel has found reason to support the key mainstream findings of past research, including points that we have highlighted previously."[30] Similarly, according to Roger A. Pielke, Jr., the National Research Council publication constituted a "near-complete vindication for the work of Mann et al.";[31] Nature reported it as "Academy affirms hockey-stick graph."[32]

According to Hans von Storch, Eduardo Zorita[33] and Jesus Rouco,[34] reviewing the NAS report on McIntyre's blog Climate Audit, "With respect to methods, the committee is showing reservations concerning the methodology of Mann et al. The committee notes explicitly on pages 91 and 111 that the method has no validation (CE) skill significantly different from zero. In the past, however, it has always been claimed that the method has a significant nonzero validation skill. Methods without a validation skill are usually considered useless."[35] It was noted by their critics, however, that no such statement, explicit or implicit, is present on the two pages cited[36]; the closest the report comes is a statement that "Some recent results reported in Table 1S of Wahl and Ammann (in press) indicate that their reconstruction, which uses the same procedure and full set of proxies used by Mann et al. (1999), gives CE values ranging from 0.103 to -0.215, depending on how far back in time the reconstruction is carried."[37]

However, CE is not the only measure of skill; Mann et al. (1998) used the more traditional "RE" score, which, unlike CE, accounts for the fact that time series change their mean value over time. The statistically significant reconstruction skill in the Mann et al. reconstruction is independently supported in the peer-reviewed literature.[38][39]
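
The difference between the two scores comes down to which baseline mean the reconstruction's errors are compared against. A minimal sketch (the function name `skill_scores` and all numbers are invented for illustration, not taken from any of the studies discussed):

```python
# RE and CE both compare a reconstruction's squared error over a
# verification period against the error of a "no-skill" constant
# prediction; they differ only in which mean supplies that baseline.

def skill_scores(actual, predicted, calibration_mean):
    """Return (RE, CE) for a verification period."""
    n = len(actual)
    verification_mean = sum(actual) / n
    sse = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_cal = sum((a - calibration_mean) ** 2 for a in actual)
    ss_ver = sum((a - verification_mean) ** 2 for a in actual)
    re = 1.0 - sse / ss_cal  # baseline: calibration-period mean
    ce = 1.0 - sse / ss_ver  # baseline: verification-period mean
    return re, ce

# Hypothetical verification-period temperature anomalies (degrees C),
# with a calibration period whose mean sits well above them:
actual = [0.10, 0.05, -0.10, -0.20, -0.15]
predicted = [0.08, 0.00, -0.05, -0.15, -0.18]
re, ce = skill_scores(actual, predicted, calibration_mean=0.30)
```

A score above zero means the reconstruction beats its baseline. When the verification-period mean differs from the calibration-period mean, as here, RE comes out higher than CE, which is why CE is regarded as the stricter test and a CE near or below zero is read as a sign of no skill.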

[edit] Committee on Energy and Commerce Report (Wegman report)

A team of statisticians led by Edward Wegman, chair of the National Academy of Sciences' (NAS) Committee on Applied and Theoretical Statistics, was assembled at the request of U.S. Rep. Joe Barton and U.S. Rep. Ed Whitfield.[40] The report primarily focused on the statistical analysis used in the MBH paper, and also considered the personal and professional relationships between Mann et al. and other members of the paleoclimate community. Findings presented in this report (commonly known as the "Wegman Report"[41][42]) at a hearing of the subcommittee on oversight and investigations, chaired by Whitfield, included the following:

  • MBH98 and MBH99 were found to be "somewhat obscure and incomplete" and the criticisms by McIntyre and McKitrick were found to be "valid and compelling".
  • The report found that the MBH method creates a PC1 statistic dominated by bristlecone and foxtail pine tree-ring series (closely related species). However, there is evidence in the literature that the use of the bristlecone pine series as a temperature proxy may not be valid (suppressing the "warm period" in the hockey stick handle), and that bristlecones do exhibit CO2-fertilized growth over the last 150 years (enhancing warming in the hockey stick blade).
  • The report noted that there is no evidence that Mann or any of the other authors of paleoclimatology studies have had significant interactions with mainstream statisticians.
  • The report described a social network of authorships in temperature reconstruction: at least 43 authors have direct ties to Mann by virtue of papers coauthored with him. The findings from this analysis suggest that authors in the area of paleoclimate studies are closely connected, and thus 'independent studies' may not be as independent as they might appear on the surface. Dr. Wegman stated this was a "hypothesis" that "should be taken with a grain of salt". [43]
  • The report stressed the isolation of the paleoclimate community: even though its members rely heavily on statistical methods, they do not seem to interact with the statistical community. Additionally, the Wegman team judged that the sharing of research materials, data and results was done haphazardly and grudgingly.
  • Overall, the committee believes that Mann's assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis.
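
The centering point behind the PC1 finding can be illustrated numerically. This is not the MBH code or data: the series are synthetic and the helper `pc1_loadings` is invented for the example. Subtracting only the mean of a recent "calibration" segment before computing principal components ("short-centering") tends to load any series with a recent trend onto PC1:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series, recent = 500, 20, 50
data = rng.standard_normal((n_years, n_series))
# Give three series a trend confined to the last 50 "years",
# a crude stand-in for proxies with 20th-century growth.
data[-recent:, :3] += np.linspace(0.0, 2.0, recent)[:, None]

def pc1_loadings(x, rows):
    """Absolute PC1 loadings after removing, from each column,
    the mean computed over the selected rows only."""
    centered = x - x[rows].mean(axis=0)
    # Leading right-singular vector = each series' loading on PC1.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return np.abs(vt[0])

full = pc1_loadings(data, slice(None))           # full-period centering
short = pc1_loadings(data, slice(-recent, None)) # recent period only

# Fraction of PC1's loading mass carried by the three trending series:
w_full = full[:3].sum() / full.sum()
w_short = short[:3].sum() / short.sum()
```

Short-centering gives the trending series a larger share of PC1's loading mass than full-period centering does; on a real proxy network, this is the mechanism by which a minority of trending series could come to dominate the leading pattern.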

The Wegman report has itself been criticized on several contentious grounds:

  • The report was not subject to formal peer review.[44][45] At the hearing, Wegman listed six people who had participated in his own informal peer review process via email after the report was finalized, and said they had no objection to the subcommittee submitting it.[43]
  • Dr. Thomas Crowley, Professor of Earth System Science at Duke University, testified at the committee hearing that "The conclusions and recommendations of the Wegman Report have some serious flaws." [43]
  • The result of fixing the alleged errors in the overall reconstruction does not change the general shape of the reconstruction. [46]
  • Similarly, studies that use completely different methodologies also yield very similar reconstructions[46].
  • The social network analysis is not based on meaningful criteria, does not prove a conflict of interest, and did not apply at the time of the 1998 and 1999 publications. Such a network of co-authorship is not unusual in narrowly defined areas of science. [47] During the hearing, Wegman defined membership in the social network as having "actively collaborated with him in writing research papers", and answered that none of his own peer reviewers had done so.[43]
  • Gerald North, chairman of the National Research Council panel that studied the hockey-stick issue and produced the report Surface Temperature Reconstructions for the Last 2,000 Years, stated the politicians at the hearing at which the Wegman report was presented "were twisting the scientific information for their own propaganda purposes. The hearing was not an information gathering operation, but rather a spin machine."[44] In testimony when asked if he disputed the methodology conclusions of Wegman's report, he stated that "No, we don't. We don't disagree with their criticism. In fact, pretty much the same thing is said in our report. But again, just because the claims are made, doesn't mean they are false."[43]
  • Mann has himself said that the report "uncritically parrots claims by two Canadians (an economist and a mineral-exploration consultant) that have already been refuted by several papers in the peer-reviewed literature inexplicably neglected by Barton's 'panel'. These claims were specifically dismissed by the National Academy in their report just weeks ago."[48]

In his opening remarks, Chairman Barton (at the hearing ex officio) commented on the politically charged nature of the entire process and on the level of disagreement over a great many of these issues:

So I want to thank Dr. Wegman and his colleagues for giving us an unvarnished, flat out non-political report. Now, admittedly, that report is going to be used probably for political purposes but that is not what he did, and I want to thank Dr. North for the work that he did in this document. Now, it is a lot thicker than Dr. Wegman's document, and Dr. North and his colleagues have kind of looked at the same subject and they have come to a somewhat little--they are little bit more, I don't want to use the technical term wishy-washy but they are kind of on both sides of it, but even Dr. North's report says that the absolute basic conclusion in Dr. Mann's work cannot be guaranteed. This report says it is plausible. Lots of things are plausible. Dr. Wegman's report says it is wrong.

Now, what we are going to do after today's hearing, we are going to take Dr. Wegman's report, and if my friends on the Minority want to shop it to their experts, so be it. We are going to put it up there, let everybody who wants to, take a shot at it. Now, my guess is that since Dr. Wegman came into this with no political axe to grind, that it is going to stand up pretty well. If Dr. Mann and his colleagues are right, their conclusion may be right--Dr. Mann's conclusion may be right but you can't verify it from his statistics in his model so if Dr. Mann's conclusion is right, it is incumbent upon him and his colleagues to go back, get the math right, get the data points right, get the modeling right. That is what science is about.[43]

This contention is further illustrated by the first sentence of the remarks of the subcommittee's ranking minority member, Bart Stupak:

"Thank you, Mr. Chairman. It is a little bewildering to me why the committee is holding its very first hearing on global warming to referee a dispute over a 1999 hockey stick graph of global temperatures for the past millennium." [43]

[edit] Updates

In a letter to Nature on August 10, 2006, Bradley, Hughes and Mann pointed to the original title of their 1999 article, "Northern Hemisphere temperatures during the past millennium: inferences, uncertainties, and limitations",[49][50] noting that "more widespread high-resolution data are needed before more confident conclusions can be reached and that the uncertainties were the point of the article." Mann and his colleagues said that it was "hard to imagine how much more explicit" they could have been about the uncertainties surrounding their work, and blamed "poor communication by others" for the "subsequent confusion." Mann has further suggested that the criticisms directed at his statistical methodology are purely political and add nothing new to the scientific debate.[51]

Paleoclimate findings by the IPCC before and after the Hockey Stick Controversy:

Before: 2001 (page 2)[52]

" proxy data for the Northern Hemisphere indicate that the increase in temperature in the 20th century is likely to have been the largest of any century during the past 1,000 years. It is also likely that, in the Northern Hemisphere, the 1990s was the warmest decade and 1998 the warmest year."

After: Current SPM statement from 2007 (page 10)[53]

""Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years and likely the highest in at least the past 1300 years. Some recent studies indicate greater variability in Northern Hemisphere temperatures than suggested in the TAR, particularly finding that cooler periods existed in the 12 to 14th, 17th, and 19th centuries. Warmer periods prior to the 20th century are within the uncertainty range given in the TAR."

In May 2007, Hans von Storch reviewed the changes in thought caused by the hockey stick controversy writing:

In October 2004 we were lucky to publish in Science our critique of the 'hockey-stick' reconstruction of the temperature of the last 1000 years. Now, two and half years later, it may be worth reviewing what has happened since then.
At the EGU General Assembly a few weeks ago there were no less than three papers from groups in Copenhagen and Bern assessing critically the merits of methods used to reconstruct historical climate variable from proxies; Bürger's papers in 2005; Moberg's paper in Nature in 2005; various papers on borehole temperature; The National Academy of Science Report from 2006 – all of which have helped to clarify that the hockey-stick methodologies lead indeed to questionable historical reconstructions. The 4th Assessment Report of the IPCC now presents a whole range of historical reconstructions instead of favoring prematurely just one hypothesis as reliable.[54]

McIntyre was critical of this Nature blog entry because von Storch did not acknowledge the role of McIntyre and McKitrick;[55] however von Storch replied[56] that:

This was on purpose, as we do not think that McIntyre has substantially contributed in the published peer-reviewed literature to the debate about the statistical merits of the MBH and related method. They have published one peer-reviewed article on a statistical aspect, and we have published a response – acknowledging that they would have a valid point in principle, but the critique would not matter in the case of the hockey-stick ... we see in principle two scientific inputs of McIntyre into the general debate – one valid point, which is however probably not relevant in this context, and another which has not been properly documented.

Because many of the claims regarding the hockey stick revolve around statistical aspects, the American Statistical Association held a session on climate change[57] at the 2006 Joint Statistical Meetings, with Edward Wegman, John Michael Wallace, and Richard L. Smith.[58] Wegman presented his discussion of the methodological aspects of the principal component analysis in MBH98, and his view that Method Wrong + Answer Correct = Bad Science. Wallace outlined the NRC report and its cautious conclusion that the claims of unprecedented temperatures in the last decades can be considered plausible (2:1 odds in favor). Smith (University of North Carolina, Statistics) analyzed the statistical methodology behind the CCSP "Report on Temperature Trends in the Lower Atmosphere"[59] and shared his vision of the role of statisticians in the process. Smith summarized the session in the ASA Section on Statistics and the Environment newsletter.[60]

In a paper published on 9 September 2008, Mann and colleagues presented an updated reconstruction of Earth surface temperature for the past two millennia.[61] This reconstruction used a more diverse dataset that was significantly larger than that of the original tree-ring study. As in the original study, this work found that recent increases in Northern Hemisphere surface temperature are anomalous relative to at least the past 1,300 years, and that this result is robust to the inclusion or exclusion of the tree-ring data.

[edit] References

  1. ^ "Hockey Stick". Realclimate. 2004-11-28. http://www.realclimate.org/index.php?p=16. Retrieved on 2007-05-08. 
  2. ^ Mann, Michael E.; Bradley, Raymond S.; Hughes, Malcolm K. (1998), "Global-scale temperature patterns and climate forcing over the past six centuries" (PDF), Nature 392: 779–787, doi:10.1038/33859, http://www.caenvirothon.com/Resources/Mann,%20et%20al.%20Global%20scale%20temp%20patterns.pdf 
  3. ^ a b "Climate legacy of 'hockey stick'". BBC. 2004-08-16. http://news.bbc.co.uk/1/hi/sci/tech/3569604.stm. Retrieved on 2007-05-08. 
  4. ^ Briffa, Keith R.; Duplessy, Jean-Claude; Joos, Fortunat; Masson-Delmotte, Valérie (2007), "Chapter 6: Paleoclimate", in Pachauri, Rajendra K.; Solomon, Susan; Qin, Dahe et al., Working Group I: The Physical Basis of Climate Change, IPCC, http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Ch06-v2.pdf 
  5. ^ von Storch, Hans; Zorita, Eduardo; Jones, Julie M.; Dimitriev, Yegor; González-Rouco, Fidel; Tett, Simon F. B. (2004), "Reconstructing Past Climate from Noisy Data", Science 306 (5696): 679–682, doi:10.1126/science.1096109, http://w3g.gkss.de/staff/storch/pdf/vonStorch2004science.pdf 
  6. ^ "A Mistake with Repercussions". Realclimate. 2006-04-27. http://www.realclimate.org/index.php/archives/2006/04/a-correction-with-repercussions/. Retrieved on 2007-05-08. 
  7. ^ Mann, Michael E.; Rutherford, Scott; Wahl, Eugene; Amman, Caspar (2005), "Testing the Fidelity of Methods Used in Proxy-Based Reconstructions of Past Climate", Journal of Climate 18 (20): 4097–4105, doi:10.1175/JCLI3564.1, http://www.meteo.psu.edu/~mann/shared/articles/MRWA-JClimate05.pdf 
  8. ^ Wahl, Eugene R.; Ritson, David M.; Ammann, Caspar M. (2006), "Comment on "Reconstructing Past Climate from Noisy Data"", Science 312 (5773): 529, doi:10.1126/science.1120866, http://w3g.gkss.de/staff/storch/pdf/wahl_060428.pdf 
  9. ^ Moberg, Anders; Sonechkin, Dmitry M.; Karlén, Wibjörn (2005), "Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data", Nature 433: 612–617, doi:10.1038/nature03265, http://www.nature.com/nature/journal/v433/n7026/pdf/nature03265.pdf 
  10. ^ Connolley, William M. (2005-02-15). "Moberg et al: Highly variable Northern Hemisphere temperatures?". Realclimate. http://www.realclimate.org/index.php?p=122. Retrieved on 2007-05-08. 
  11. ^ What is the 'Hockey stick' debate about?, by Ross McKitrick (PDF, 0.5 MiB)
  12. ^ http://holocene.meteo.psu.edu/Mann/EEReply.html
  13. ^ Mann, Michael E.; Bradley, Raymond S.; Hughes, Malcolm K. (2003), Note on paper by McIntyre and McKitrick in Energy and Environment, http://holocene.meteo.psu.edu/Mann/EandEPaperProblem.pdf 
  14. ^ Michael Mann and Gavin Schmidt on peer reviewing
  15. ^ McIntyre, Stephen; McKitrick, Ross (2005), "Hockey sticks, principal components, and spurious significance" (PDF), Geophysical Research Letters 32: L03710, doi:10.1029/2004GL021750, http://www.climateaudit.org/pdf/mcintyre.grl.2005.pdf 
  16. ^ McIntyre, Steven; Ross McKitrick. "The M&M Project: Replication Analysis of the Mann et al. Hockey Stick". http://www.uoguelph.ca/~rmckitri/research/trc.html. Retrieved on 2007-05-08. 
  17. ^ AGU Journal Highlights 9 March 2005
  18. ^ Wall Street Journal blog
  19. ^ "Mann on Source Code" by Stephen McIntyre[1]
  20. ^ "Title to MBH98 Source Code" by Stephen McIntyre [2]
  21. ^ a b "Letter from Congress to Dr. Mann dated June 23, 2005" [3]
  22. ^ "In Climate Debate, the 'Hockey Stick' Leads to a Face-Off". http://online.wsj.com/public/article/SB110834031507653590-DUadAZBzxH0SiuYH3tOdgUmKXPo_20060207.html?mod=blogs. Retrieved on 2007-04-29. 
  23. ^ Dear Mr
  24. ^ "The Wegman Report" [4]
  25. ^ "Surface Temperature Reconstructions for the last 2,000 years" by National Academy of Science [5]
  26. ^ "A Scorecard on MM03" by McIntyre and McKitrick [6]
  27. ^ Academy affirms hockey-stick graph.
  28. ^ Committee on Surface Temperature Reconstructions for the Past 2,000 Years. Surface Temperature Reconstructions for the Past 2,000 Years. The National Academies Press, Washington, D.C. 2006.
  29. ^ Surface Temperature Reconstructions for the Last 2,000 Years
  30. ^ RealClimate
  31. ^ R. Pielke Jr.: Quick Reaction to the NRC Hockey Stick Report, Prometheus 22 June 2006
  32. ^ Academy affirms hockey-stick graph (paid archive); Nature Volume 441 Number 7097 p. 1032, 28 June 2006. doi:10.1038/4411032a
  33. ^ http://w3g.gkss.de/staff/zorita/
  34. ^ Jesus Fidel Gonzalez Rouco Home Page
  35. ^ Hans von Storch, Eduardo Zorita, Fidel González-Rouco: Press release and comment on the NAS report "Surface Temperature Reconstructions for the Last 2,000 Years"; 22 June 2006
  36. ^ 91 and 111
  37. ^ page 95
  38. ^ Huybers (2005)
  39. ^ Wahl and Ammann (2006)
  40. ^ Blog by Eric Berger
  41. ^ full report
  42. ^ fact-sheet
  43. ^ a b c d e f g subcommittee transcript of hearing
  44. ^ a b North Interview
  45. ^ Duane D. Freese
  46. ^ a b The missing piece at the Wegman hearing
  47. ^ Adventures in social network analysis
  48. ^ 'The Discovery of Global Warming' update
  49. ^ Geophys. Res. Lett. 26, 759–762; 1999
  50. ^ Bradley, Raymond S.; Hughes, Malcolm K.; Mann, Michael E. (2006), "Authors were clear about hockey-stick uncertainties", Nature 442 (7103): 627, doi:10.1038/442627b, http://www.nature.com/doifinder/10.1038/442627b 
  51. ^ War of words over new climate change report, 'hockey stick' model
  52. ^ http://www.ipcc.ch/pub/spm22-01.pdf
  53. ^ IPCC Summary for Policymakers
  54. ^ Hans von Storch and Eduardo Zorita on the Hockey stick effect
  55. ^ Climate Audit - by Steve McIntyre » von Storch and Zorita blog on the Hockey Stick
  56. ^ Hans von Storch
  57. ^ 2006 Joint Statistical Meetings online program
  58. ^ Richard L. Smith
  59. ^ Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences. Thomas R. Karl, Susan J. Hassol, Christopher D. Miller, and William L. Murray, editors, 2006. A Report by the Climate Change Science Program and the Subcommittee on Global Change Research, Washington, DC.
  60. ^ ASA Section on Statistics and the Environment Newsletter, Spring 2007
  61. ^ Mann, M.E.; Zhang, Z.; Hughes, M.K.; Bradley, R.S.; Miller, S.K.; Rutherford, S.; Ni, F. (2008). "Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia". PNAS 105: 13252–13257. doi:10.1073/pnas.0805721105. 

[edit] External links

Ongoing updates related to the MBH work are accessible in two weblogs:




Scientific method

From Wikipedia, the free encyclopedia


Scientific method refers to bodies of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. To be termed scientific, a method of inquiry must be based on gathering observable, empirical and measurable evidence subject to specific principles of reasoning.[1] A scientific method consists of the collection of data through observation and experimentation, and the formulation and testing of hypotheses.[2]

Although procedures vary from one field of inquiry to another, identifiable features distinguish scientific inquiry from other methodologies of knowledge. Scientific researchers propose hypotheses as explanations of phenomena, and design experimental studies to test these hypotheses. These steps must be repeatable in order to dependably predict any future results. Theories that encompass wider domains of inquiry may bind many hypotheses together in a coherent structure. This in turn may help form new hypotheses or place groups of hypotheses into context.

Among other facets shared by the various fields of inquiry is the conviction that the process be objective to reduce a biased interpretation of the results. Another basic expectation is to document, archive and share all data and methodology so they are available for careful scrutiny by other scientists, thereby allowing other researchers the opportunity to verify results by attempting to reproduce them. This practice, called full disclosure, also allows statistical measures of the reliability of these data to be established.


Introduction to scientific method

Ibn al-Haytham (Alhazen), 965–1039, Basra.

Since Ibn al-Haytham (Alhazen, 965–1039), one of the key figures in developing scientific method, the emphasis has been on seeking truth:

"Truth is sought for its own sake. And those who are engaged upon the quest for anything for its own sake are not interested in other things. Finding the truth is difficult, and the road to it is rough."[3]

"How does light travel through transparent bodies? Light travels through transparent bodies in straight lines only.... We have explained this exhaustively in our Book of Optics. But let us now mention something to prove this convincingly: the fact that light travels in straight lines is clearly observed in the lights which enter into dark rooms through holes.... [T]he entering light will be clearly observable in the dust which fills the air."[4]

Alhazen in Book of Optics (1021): light travels in straight lines.

The conjecture that "light travels through transparent bodies in straight lines only" was corroborated by Alhazen only after years of effort. His demonstration was to place a straight stick or a taut thread next to the light beam,[5] against which the straightness of the beam could be checked.

Scientific methodology has been practiced in some form for at least one thousand years. There are difficulties in a formulaic statement of method, however. As William Whewell (1794–1866) noted in his History of the Inductive Sciences (1837) and in The Philosophy of the Inductive Sciences (1840), "invention, sagacity, genius" are required at every step in scientific method. It is not enough to base scientific method on experience alone;[6] multiple steps are needed, ranging back and forth from experience to imagination.

In the twentieth century, a hypothetico-deductive model for scientific method was formulated (for a more formal discussion, see below):

1. Use your experience: Consider the problem and try to make sense of it. Look for previous explanations. If this is a new problem to you, then move to step 2.
2. Form a conjecture: When nothing else is yet known, try to state an explanation, to someone else, or to your notebook.
3. Deduce a prediction from that explanation: If you assume 2 is true, what consequences follow?
4. Test: Look for the opposite of each consequence in order to disprove 2. It is a logical error to seek 3 directly as proof of 2. This error is called affirming the consequent.

This model underlies the scientific revolution. One thousand years ago, Alhazen demonstrated the importance of steps 1 and 4. Galileo (1638) also showed the importance of step 4 (also called Experiment) in Two New Sciences. One possible sequence in this model would be 1, 2, 3, 4. If the outcome of 4 holds, and 3 is not yet disproven, you may continue with 3, 4, 1, and so forth; but if the outcome of 4 shows 3 to be false, you will have to go back to 2, try to invent a new 2, deduce a new 3, look for 4, and so forth.
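
The loop can be made concrete with a toy program (a pedagogical illustration only; the conjecture is arithmetical, not scientific): state a conjecture, deduce a prediction, then attempt to falsify it by searching for a counterexample rather than by collecting confirming cases.

```python
# A toy run of steps 2-4: conjecture, predict, then try to falsify.

def is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# Step 2, conjecture: "every odd number greater than 1 is prime."
conjecture = is_prime

# Step 3, prediction deduced from the conjecture: any odd number
# greater than 1 we test should turn out to be prime.
# Step 4, test: look for a counterexample (an attempt to disprove),
# rather than piling up confirming cases (affirming the consequent).
counterexample = next(
    (n for n in range(3, 100, 2) if not conjecture(n)), None
)
```

Testing 3, 5 and 7 would only have affirmed the consequent; the single counterexample 9 = 3 x 3 settles the matter and sends the investigator back to step 2 for a new conjecture.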

Note that this method can never absolutely verify (prove the truth of) 2. It can only falsify 2.[7] (This is what Einstein meant when he said "No amount of experimentation can ever prove me right; a single experiment can prove me wrong."[8]) However, as pointed out by Carl Hempel (1905–1997), this simple Popperian view of scientific method is incomplete; the formulation of the conjecture might itself be the result of inductive reasoning. Thus the likelihood of the prior observation being true is statistical in nature,[9] and would strictly require a Bayesian analysis. To overcome this uncertainty, experimental scientists must formulate a crucial experiment whose outcome can corroborate one hypothesis as more likely than its rivals.
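
The Bayesian point can be made concrete with a two-line update (all probabilities invented for illustration; `posterior` is just a helper name): how much an experiment shifts belief in a hypothesis depends on the prior as well as on the likelihoods.

```python
# A minimal Bayesian update, illustrating why the prior plausibility
# of a conjecture matters when weighing experimental support.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' theorem."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1.0 - prior))
    return p_evidence_given_h * prior / p_evidence

# An experimental outcome 9 times likelier if the hypothesis is true
# than if it is false, applied to two different priors:
weak_prior = posterior(0.01, 0.9, 0.1)    # implausible conjecture
strong_prior = posterior(0.5, 0.9, 0.1)   # even-odds conjecture
```

The same evidence moves the even-odds hypothesis to a posterior of 0.9 but the implausible one only to about 0.08; a crucial experiment corresponds to choosing a test whose likelihood ratio between rival hypotheses is large enough to discriminate sharply between them.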

In the twentieth century, Ludwik Fleck (1896–1961) and others found that we need to consider our experiences more carefully, because our experience may be biased, and that we need to be more exact when describing our experiences.[10] These considerations are discussed below.

DNA example
The Keystones of Science project, sponsored by the journal Science, has selected a number of scientific articles from that journal and annotated them, illustrating how different parts of each article embody scientific method. Here is an annotated example, the article titled Microbial Genes in the Human Genome: Lateral Transfer or Gene Loss?.

Each element of scientific method can be illustrated by an example from the discovery of the structure of DNA:

The examples are continued in "Evaluations and iterations" with DNA-iterations.[15]
Flying horse depiction: disproven; see below.

Truth and belief

Main article: Truth

Belief can alter observations; those with a particular belief will often see things as reinforcing their belief, even if the observations do not support it.[16] Needham's Science and Civilisation in China uses the 'flying horse' image as an example of observation: in it, a horse's legs are depicted as splayed, whereas the stop-action pictures of Eadweard Muybridge show otherwise. At the moment when no hoof is touching the ground, a galloping horse's legs are gathered together, not splayed, except when the horse is jumping.

Earlier paintings depict the incorrect flying horse observation. This demonstrates Ludwik Fleck's caution that people observe what they expect to observe, until shown otherwise; our beliefs affect our observations (and therefore our subsequent actions). The purpose of the scientific method is to test a hypothesis (a proposed explanation of how things are) via repeatable experimental observations that can contradict the hypothesis, thereby countering this observer bias.

Elements of scientific method

There are many ways of outlining the basic method shared by all fields of scientific inquiry. The following examples are typical classifications of the most important components of the method on which there is wide agreement in the scientific community and among philosophers of science. There are, however, disagreements about some aspects.

The following set of methodological elements and organization of procedures tends to be more characteristic of natural sciences than social sciences. In the social sciences, mathematical and statistical methods of verification and hypothesis testing may be less stringent. Nonetheless, the cycle of hypothesis, verification and formulation of new hypotheses will resemble the cycle described below.


The essential elements[17][18][19] of a scientific method[20] are iterations,[21][22] recursions,[23] interleavings, and orderings of the following:

  • Characterizations (observations, definitions, and measurements of the subject of inquiry)
  • Hypotheses (theoretical, hypothetical explanations of the observations and measurements)
  • Predictions (reasoning, including logical deduction, from the hypothesis or theory)
  • Experiments (tests of all of the above)

Each element of a scientific method is subject to peer review for possible mistakes. These activities do not describe all that scientists do (see below) but apply mostly to experimental sciences (e.g., physics, chemistry). The elements above are often taught in the educational system.[30]

Scientific method is not a recipe: it requires intelligence, imagination, and creativity.[31] It is also an ongoing cycle, constantly developing more useful, accurate and comprehensive models and methods. For example, when Einstein developed the Special and General Theories of Relativity, he did not in any way refute or discount Newton's Principia. On the contrary, when the astronomically large, the vanishingly small, and the extremely fast are removed from Einstein's theories - all phenomena that Newton could not have observed - Newton's equations remain. Einstein's theories are expansions and refinements of Newton's theories, and observations that increase our confidence in them also increase our confidence in Newton's approximations to them.

A linearized, pragmatic scheme of the four points above is sometimes offered as a guideline for proceeding:[32]

  1. Define the question
  2. Gather information and resources (observe)
  3. Form hypothesis
  4. Perform experiment and collect data
  5. Analyze data
  6. Interpret data and draw conclusions that serve as a starting point for new hypothesis
  7. Publish results
  8. Retest (frequently done by other scientists)

The iterative cycle inherent in this step-by-step methodology runs from point 3 through point 6 and back to 3 again.
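As a toy illustration of this loop - the "system" and all numbers below are invented, standing in for nature - the cycle from steps 3 through 6 can be sketched as a short program that forms a hypothesis, tests it against observation, and revises it until it survives:

```python
# Toy sketch of the iterative cycle in steps 3-6: hypothesize, experiment,
# analyze, revise. The "system" below is entirely invented for illustration.

def system(x):
    """Nature: a hidden law the investigator is trying to characterize."""
    return x * 9.81  # the secret proportionality constant

def inquiry(max_rounds=100):
    hypothesis = 1.0                 # step 3: initial guess at the constant
    for _ in range(max_rounds):
        x = 2.0
        observed = system(x)         # step 4: perform experiment
        predicted = hypothesis * x   # step 5: analyze against prediction
        if abs(observed - predicted) < 1e-9:
            return hypothesis        # step 6: conclusion survives testing
        hypothesis += (observed - predicted) / x  # revise; back to step 3
    return hypothesis

constant = inquiry()
```

The termination test mirrors step 8: the surviving hypothesis is the one whose predictions repeated experiments fail to contradict.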

While this schema outlines a typical hypothesis/testing method,[33] a number of philosophers, historians and sociologists of science (perhaps most notably Paul Feyerabend) claim that such descriptions of scientific method have little relation to the way science is actually practiced.

The "operational" paradigm combines the concepts of operational definition, instrumentalism, and utility:

The essential elements of a scientific method are operations, observations, models, and a utility function for evaluating models.[34]

  • Operation - Some action done to the system being investigated
  • Observation - What happens when the operation is done to the system
  • Model - A hypothesized explanation relating the observations to the operations
  • Utility function - A measure of the usefulness of the model

Characterizations

Scientific method depends upon increasingly more sophisticated characterizations of the subjects of investigation. (The subjects can also be called unsolved problems or the unknowns.) For example, Benjamin Franklin correctly characterized St. Elmo's fire as electrical in nature, but it took a long series of experiments and theoretical work to establish this. While seeking the pertinent properties of the subjects, this careful thought may also entail some definitions and observations; the observations often demand careful measurements and/or counting.

  • "I am not accustomed to saying anything with certainty after only one or two observations." (Andreas Vesalius, 1546)[35]

The systematic, careful collection of measurements or counts of relevant quantities is often the critical difference between pseudo-sciences, such as alchemy, and sciences, such as chemistry or biology. Scientific measurements are usually tabulated, graphed, or mapped, and statistical manipulations such as correlation and regression are performed on them. The measurements might be made in a controlled setting, such as a laboratory, or made on more or less inaccessible or unmanipulatable objects such as stars or human populations. The measurements often require specialized scientific instruments such as thermometers, spectroscopes, or voltmeters, and the progress of a scientific field is usually intimately tied to their invention and development.
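As a minimal sketch of the statistical manipulations mentioned above, the following computes a Pearson correlation coefficient and a least-squares regression line by hand; the paired data are invented purely for illustration:

```python
import statistics

# Invented paired measurements, e.g. temperature vs. reaction rate.
x = [10.0, 15.0, 20.0, 25.0, 30.0]
y = [2.1, 3.0, 3.9, 5.2, 5.8]

mean_x, mean_y = statistics.mean(x), statistics.mean(y)

# Pearson correlation coefficient r
cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
var_x = sum((xi - mean_x) ** 2 for xi in x)
var_y = sum((yi - mean_y) ** 2 for yi in y)
r = cov / (var_x ** 0.5 * var_y ** 0.5)

# Least-squares regression line y = a + b * x
b = cov / var_x
a = mean_y - b * mean_x
```

An r near 1 indicates a strong linear association; the slope b and intercept a summarize the fitted relationship.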

Uncertainty

Measurements in scientific work are also usually accompanied by estimates of their uncertainty. The uncertainty is often estimated by making repeated measurements of the desired quantity. Uncertainties may also be calculated by consideration of the uncertainties of the individual underlying quantities that are used. Counts of things, such as the number of people in a nation at a particular time, may also have an uncertainty due to limitations of the method used. Counts may only represent a sample of desired quantities, with an uncertainty that depends upon the sampling method used and the number of samples taken.
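Both approaches can be sketched briefly: the standard error of the mean from repeated measurements, and propagation of independent underlying uncertainties in quadrature. The readings and uncertainties below are invented for illustration:

```python
import math
import statistics

# Hypothetical repeated measurements of one quantity (e.g. a length in cm).
readings = [9.98, 10.02, 10.01, 9.97, 10.03, 9.99]

mean = statistics.mean(readings)
s = statistics.stdev(readings)       # sample standard deviation
sem = s / math.sqrt(len(readings))   # standard error of the mean

# Propagation for a derived quantity q = x * y, assuming the
# uncertainties dx and dy are independent: relative uncertainties
# add in quadrature.
x, dx = 10.0, 0.05
y, dy = 2.0, 0.02
q = x * y
dq = q * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
```

The quadrature rule applies only when the underlying errors are independent; correlated errors require the full covariance.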

Definition

Measurements demand the use of operational definitions of relevant quantities. That is, a scientific quantity is described or defined by how it is measured, as opposed to some more vague, inexact or "idealized" definition. For example, electrical current, measured in amperes, may be operationally defined in terms of the mass of silver deposited in a certain time on an electrode in an electrochemical device that is described in some detail. The operational definition of a thing often relies on comparisons with standards: the operational definition of "mass" ultimately relies on the use of an artifact, such as a certain kilogram of platinum-iridium kept in a laboratory in France.
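For instance, the historical "international ampere" was operationally defined by exactly such an electrochemical procedure: the current that deposits silver at 0.001118 grams per second from a silver nitrate solution. A sketch of using that definition in reverse (the example deposit mass is invented):

```python
# The historical "international ampere": the current depositing
# 0.001118 g of silver per second (1.118 mg/s is the published
# historical figure) from a silver nitrate solution.
SILVER_RATE_G_PER_AS = 0.001118  # grams of silver per ampere-second

def current_from_deposit(mass_g, seconds):
    """Infer current (in amperes) from a measured silver deposit."""
    return mass_g / (SILVER_RATE_G_PER_AS * seconds)

# Invented example: 4.0248 g deposited over one hour corresponds to ~1 A.
i = current_from_deposit(4.0248, 3600.0)
```

The quantity "current" is thereby defined entirely by a measurement procedure, with no appeal to an idealized notion of charge flow.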

The scientific definition of a term sometimes differs substantially from its natural language usage. For example, mass and weight overlap in meaning in common discourse, but have distinct meanings in mechanics. Scientific quantities are often characterized by their units of measure which can later be described in terms of conventional physical units when communicating the work.

New theories sometimes arise upon realizing that certain terms had not previously been sufficiently clearly defined. For example, Albert Einstein's first paper on relativity begins by defining simultaneity and the means for determining length. These ideas were skipped over by Isaac Newton with, "I do not define time, space, place and motion, as being well known to all." Einstein's paper then demonstrates that they (viz., absolute time and length independent of motion) were approximations. Francis Crick cautions us, however, that when characterizing a subject it can be premature to define something while it remains ill-understood.[36] In Crick's study of consciousness, he found it easier to study awareness in the visual system than to study, for example, free will. His cautionary example was the gene: the gene was much more poorly understood before Watson and Crick's pioneering discovery of the structure of DNA, and it would have been counterproductive to spend much time on defining it before then.

Example of characterizations

DNA-characterizations

The history of the discovery of the structure of DNA is a classic example of the elements of scientific method: in 1950 it was known that genetic inheritance had a mathematical description, starting with the studies of Gregor Mendel, but the mechanism of the gene was unclear. Researchers in Bragg's laboratory at Cambridge University made X-ray diffraction pictures of various molecules, starting with crystals of salt and proceeding to more complicated substances. Using clues painstakingly assembled over decades, beginning with its chemical composition, it was determined that it should be possible to characterize the physical structure of DNA, and that the X-ray images would be the vehicle.

Precession of the perihelion of Mercury (exaggerated)

The characterization element can require extended and extensive study, even centuries. It took thousands of years of measurements, from the Chaldean, Indian, Persian, Greek, Arabic and European astronomers, to record the motion of planet Earth. Newton was able to condense these measurements into consequences of his laws of motion. But the perihelion of the planet Mercury's orbit exhibits a precession that is not fully explained by Newton's laws of motion. The observed difference for Mercury's precession between Newtonian theory and relativistic theory (approximately 43 arc-seconds per century), was one of the things that occurred to Einstein as a possible early test of his theory of General Relativity.
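The 43 arc-seconds figure can be checked with a short back-of-the-envelope computation, using the standard general-relativistic formula for perihelion advance per orbit and published orbital constants for Mercury:

```python
import math

# Standard general-relativistic formula for perihelion advance per orbit
# (radians): delta_phi = 6*pi*G*M / (c^2 * a * (1 - e^2)).
# Constants below are standard published values for the Sun and Mercury.
GM_sun = 1.32712440018e20   # G*M for the Sun, m^3/s^2
c = 2.99792458e8            # speed of light, m/s
a = 5.7909e10               # Mercury's semi-major axis, m
e = 0.2056                  # Mercury's orbital eccentricity
period_days = 87.969        # Mercury's orbital period

per_orbit = 6 * math.pi * GM_sun / (c**2 * a * (1 - e**2))
orbits_per_century = 36525.0 / period_days
arcsec_per_century = per_orbit * orbits_per_century * (180 / math.pi) * 3600
# arcsec_per_century comes out close to the observed ~43"/century
```

That the per-orbit advance is tiny (about half a microradian) is why thousands of years of accumulated measurements were needed to expose the anomaly.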

Hypothesis development

A hypothesis is a suggested explanation of a phenomenon, or alternately a reasoned proposal suggesting a possible correlation between or among a set of phenomena.

Normally hypotheses have the form of a mathematical model. Sometimes, but not always, they can also be formulated as existential statements, stating that some particular instance of the phenomenon being studied has some characteristic, or as causal explanations, which have the general form of universal statements, stating that every instance of the phenomenon has a particular characteristic.

Scientists are free to use whatever resources they have — their own creativity, ideas from other fields, induction, Bayesian inference, and so on — to imagine possible explanations for a phenomenon under study. Charles Sanders Peirce, borrowing a page from Aristotle (Prior Analytics, 2.25) described the incipient stages of inquiry, instigated by the "irritation of doubt" to venture a plausible guess, as abductive reasoning. The history of science is filled with stories of scientists claiming a "flash of inspiration", or a hunch, which then motivated them to look for evidence to support or refute their idea. Michael Polanyi made such creativity the centerpiece of his discussion of methodology.

William Glen observes that

the success of a hypothesis, or its service to science, lies not simply in its perceived "truth", or power to displace, subsume or reduce a predecessor idea, but perhaps more in its ability to stimulate the research that will illuminate … bald suppositions and areas of vagueness.[37]

In general scientists tend to look for theories that are "elegant" or "beautiful". In contrast to the usual English use of these terms, they here refer to a theory in accordance with the known facts, which is nevertheless relatively simple and easy to handle. Occam's Razor serves as a rule of thumb for making these determinations.

DNA-hypotheses

Linus Pauling proposed that DNA might be a triple helix. Francis Crick and James Watson learned of Pauling's hypothesis, understood from existing data that Pauling was wrong, and realized that Pauling would soon discover his mistake. So the race was on to figure out the correct structure - except that Pauling did not realize at the time that he was in a race!

Predictions from the hypothesis

Any useful hypothesis will enable predictions, by reasoning including deductive reasoning. It might predict the outcome of an experiment in a laboratory setting or the observation of a phenomenon in nature. The prediction can also be statistical and only talk about probabilities.

It is essential that the outcome be currently unknown: only then does a successful prediction increase the probability that the hypothesis is true. If the outcome is already known, it is called a consequence and should already have been considered while formulating the hypothesis.

If the predictions are not accessible by observation or experience, the hypothesis is not yet useful for the method, and must wait for others who might come afterward, and perhaps rekindle its line of reasoning. For example, a new technology or theory might make the necessary experiments feasible.

DNA-predictions

The hypothesis (by James Watson and Francis Crick among others) that DNA had a helical structure implied the prediction that it would produce an X-shaped X-ray diffraction pattern. This followed from the work of Cochran, Crick and Vand[13] (and independently Stokes), who had provided a mathematical basis for the empirical observation that helical structures produce X shapes.

Also in their first paper, Watson and Crick predicted that the double helix structure provided a simple mechanism for DNA replication, writing "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material".

General relativity

Einstein's theory of General Relativity makes several specific predictions about the observable structure of space-time, such as a prediction that light bends in a gravitational field and that the amount of bending depends in a precise way on the strength of that gravitational field. Arthur Eddington's observations made during a 1919 solar eclipse supported General Relativity rather than Newtonian gravitation.

Experiments

Main article: Experiments

Once predictions are made, they can be tested by experiments. If test results contradict predictions, then the hypotheses are called into question and explanations may be sought. Sometimes experiments are conducted incorrectly and are at fault. If the results confirm the predictions, then the hypotheses are considered likely to be correct but might still be wrong and are subject to further testing. The experimental control is a technique for dealing with observational error. This technique uses the contrast between multiple samples (or observations) under differing conditions, to see what varies or what remains the same. We vary the conditions for each measurement, to help isolate what has changed. Mill's canons can then help us figure out what the important factor is. Factor analysis is one technique for discovering the important factor in an effect.
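A minimal sketch of this contrast between samples under differing conditions - here an invented two-group comparison, with the observed difference summarized by a two-sample t statistic:

```python
import math
import statistics

# Invented measurements of the same quantity under two conditions that are
# identical except for the single factor under test.
control   = [4.8, 5.1, 4.9, 5.2, 5.0]
treatment = [5.9, 6.2, 6.0, 5.8, 6.1]

m_c, m_t = statistics.mean(control), statistics.mean(treatment)
s_c, s_t = statistics.stdev(control), statistics.stdev(treatment)

# Two-sample t statistic (equal group sizes): the observed difference
# between conditions, relative to the scatter within each group.
n = len(control)
t = (m_t - m_c) / math.sqrt(s_c**2 / n + s_t**2 / n)
```

A large t means the between-condition difference dwarfs the within-condition scatter, isolating the varied factor as the likely cause.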

Depending on the predictions, experiments can take different shapes: a classical experiment in a laboratory setting, a double-blind study, or an archaeological excavation. Even taking a plane from New York to Paris is an experiment that tests the aerodynamic hypotheses used in constructing the plane.

Scientists assume an attitude of openness and accountability on the part of those conducting an experiment. Detailed record keeping is essential, to aid in recording and reporting on the experimental results and to provide evidence of the effectiveness and integrity of the procedure. Records also assist in reproducing the experimental results. Traces of this tradition can be seen in the work of Hipparchus (190-120 BCE), when determining a value for the precession of the equinoxes, while controlled experiments can be seen in the works of Muslim scientists such as Geber (721-815 CE), al-Battani (853-929) and Alhacen (965-1039).

DNA-experiments

Watson and Crick showed an initial (and incorrect) proposal for the structure of DNA to a team from King's College - Rosalind Franklin, Maurice Wilkins, and Raymond Gosling. Franklin immediately spotted the flaws, which concerned the water content. Later, Watson saw Franklin's detailed X-ray diffraction images, which showed an X-shape and confirmed that the structure was helical.[14][38] This rekindled Watson and Crick's model building and led to the correct structure.

Evaluation and iteration

Testing and improvement

The scientific process is iterative. At any stage it is possible that some consideration will lead the scientist to repeat an earlier part of the process. Failure to develop an interesting hypothesis may lead a scientist to re-define the subject under consideration. Failure of a hypothesis to produce interesting and testable predictions may lead to reconsideration of the hypothesis or of the definition of the subject. Failure of an experiment to produce interesting results may lead the scientist to reconsider the experimental method, the hypothesis, or the definition of the subject.

Other scientists may start their own research and enter the process at any stage. They might adopt the characterization and formulate their own hypothesis, or they might adopt the hypothesis and deduce their own predictions. Often the experiment is not done by the person who made the prediction and the characterization is based on experiments done by someone else. Published results of experiments can also serve as a hypothesis predicting their own reproducibility.

DNA-iterations

After considerable fruitless experimentation, being discouraged by their superior from continuing, and numerous false starts, Watson and Crick were able to infer the essential structure of DNA by concrete modeling of the physical shapes of the nucleotides which comprise it.[15][39] They were guided by the bond lengths which had been deduced by Linus Pauling and by Rosalind Franklin's X-ray diffraction images.

Confirmation

Science is a social enterprise, and scientific work tends to be accepted by the community when it has been confirmed. Crucially, experimental and theoretical results must be reproduced by others within the science community. Researchers have given their lives for this vision; Georg Wilhelm Richmann was killed by ball lightning (1753) when attempting to replicate the 1752 kite-flying experiment of Benjamin Franklin.[40]

To protect against bad science and fraudulent data, government research granting agencies like NSF and science journals like Nature and Science have a policy that researchers must archive their data and methods so other researchers can access it, test the data and methods and build on the research that has gone before. Scientific data archiving can be done at a number of national archives in the U.S. or in the World Data Center.

Models of scientific inquiry

Classical model

The classical model of scientific inquiry derives from Aristotle[41], who distinguished the forms of approximate and exact reasoning, set out the threefold scheme of abductive, deductive, and inductive inference, and also treated the compound forms such as reasoning by analogy.

Pragmatic model

Charles Sanders Peirce (1839-1914) considered scientific inquiry to be a species of the genus inquiry, which he defined as any means of fixing belief, that is, any means of arriving at a settled opinion on a matter in question. He observed that inquiry in general begins with a state of uncertainty and moves toward a state of certainty, sufficient at least to terminate the inquiry for the time being. In 1877[42], he outlined four methods for the settlement of doubt, graded by their success in achieving a sound fixation of belief:

  1. The method of tenacity — persisting in that which one is inclined to think.
  2. The method of authority — conformity to a source of ready-made beliefs.
  3. The method of congruity or the a priori or the dilettante or "what is agreeable to reason" — leading to argumentation that gets finally nowhere.
  4. The scientific method.

Peirce held that slow and stumbling ratiocination can be dangerously inferior to instinct, sentiment, and tradition in practical matters, and that the scientific method is best suited to theoretical research,[43] which in turn should not be trammeled by the other methods and practical ends. What recommends the specifically scientific method of inquiry above all others is the fact that it is deliberately designed to arrive, eventually, at the ultimately most secure beliefs, upon which the most successful actions can eventually be based.[44]

In Peirce's view, the conception of inquiry depends on, but also informs, the conceptions of truth and the real; to reason is to presuppose, as a principle of the reasoner's self-regulation, that the truth is discoverable and independent of our vagaries of opinion. As a question of presuppositions of reasoning, he defined truth as the correspondence of a sign (in particular, a proposition) to its object and, pragmatically, not as any actual consensus of any finite community (i.e., such that to inquire would be to go ask the experts for the answers), but instead as that ideal final opinion which all reasonable scientific intelligences would reach despite individual vagaries, sooner or later but still inevitably, if they pushed investigation far enough[45]. In tandem he defined the real as a true sign's object (be that object a possibility or quality, or an actuality or brute fact, or a necessity or norm or law), which is what it is independently of any finite community's opinion and, pragmatically, has dependence only on the ideal final opinion. That is an opinion as far or near as the truth itself to you or me or any finite community of minds. Thus his theory of inquiry boils down to "do the science." At the same time those conceptions of truth and the real involve the idea of a community, both without definite limits and capable of definite increase of knowledge.[46] As inference, "logic is rooted in the social principle".[47]

Paying special attention to the generation of explanations, Peirce outlined scientific method as a collaboration of kinds of inference in a purposeful cycle aimed at settling doubts, as follows[48]:

1. Abduction (or retroduction). Guessing, inference to the best explanation, generation of explanatory hypothesis. From abduction, Peirce distinguishes induction as inferring, on the basis of tests, the proportion of truth in the hypothesis. Every inquiry, whether into ideas, brute facts, or norms and laws, arises as a result of surprising observations in the given realm or realms, and the pondering of the phenomenon in all its aspects in the attempt to resolve the wonder. All explanatory content of theories is reached by way of abduction, the most insecure among modes of inference. Induction as a process is far too slow for that job, so economy of research demands and even governs abduction[49], whose modicum of success depends on one's being somehow attuned to nature, by dispositions learned and, some of them, likely inborn. Abduction has general justification inductively in that it works often enough and that nothing else works, at least not quickly enough when science is already properly rather slow, the work of indefinitely many generations. Peirce calls his pragmatism "the logic of abduction"[50]. His Pragmatic Maxim is: "Consider what effects that might conceivably have practical bearings you conceive the objects of your conception to have. Then, your conception of those effects is the whole of your conception of the object"[45]. His pragmatism is a method of sorting out conceptual confusions by equating the meaning of any concept with the conceivable practical consequences of whatever it is which the concept portrays. It is a method of experimentational mental reflection arriving at conceptions in terms of conceivable confirmatory and disconfirmatory circumstances — a method hospitable to the generation of explanatory hypotheses, and conducive to the employment and improvement of verification to test the truth of putative knowledge. 
Given abduction's dependence on mental processes not necessarily conscious and deliberate but, in any case, attuned to nature, and given abduction's being driven by the need to economize the inquiry process, its explanatory hypotheses should be optimally simple in the sense of "natural" (for which Peirce cites Galileo and which Peirce distinguishes from "logically simple"). Given abduction's insecurity, it should have consequences with conceivable practical bearing leading at least to mental tests, and, in science, lending themselves to scientific testing.

2. Deduction. Analysis of hypothesis and deduction of its consequences in order to test the hypothesis. Two stages:

i. Explication. Logical analysis of the hypothesis in order to render it as distinct as possible.
ii. Demonstration (or deductive argumentation). Deduction of hypothesis's consequence. Corollarial or, if needed, Theorematic.

3. Induction. The long-run validity of the rule of induction is deducible from the principle (presuppositional to reasoning in general[45]) that the real is only the object of the final opinion to which adequate investigation would lead.[51] In other words, if there were something to which an inductive process involving ongoing tests or observations would never lead, then that thing would not be real. Three stages:

i. Classification. Classing objects of experience under general ideas.
ii. Probation (or direct Inductive Argumentation): Crude (the enumeration of instances) or Gradual (new estimate of proportion of truth in the hypothesis after each test). Gradual Induction is Qualitative or Quantitative; if Quantitative, then dependent on measurements, or on statistics, or on countings.
iii. Sentential Induction. "...which, by Inductive reasonings, appraises the different Probations singly, then their combinations, then makes self-appraisal of these very appraisals themselves, and passes final judgment on the whole result"[48].

Computational approaches

Many subspecialties of applied logic and computer science, such as artificial intelligence, machine learning, computational learning theory, inferential statistics, and knowledge representation, are concerned with setting out computational, logical, and statistical frameworks for the various types of inference involved in scientific inquiry. In particular, they contribute to hypothesis formation, logical deduction, and empirical testing. Some of these applications draw on measures of complexity from algorithmic information theory to guide the making of predictions from prior distributions of experience; see, for example, the complexity measure called the speed prior, from which a computable strategy for optimal inductive reasoning can be derived.
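As one tiny illustration of the statistical side of such frameworks - all numbers below are invented - a Bayesian update weighs two competing hypotheses about a coin against a run of observations:

```python
# Two candidate hypotheses about a coin (fair vs. biased toward heads),
# updated on a run of observations. The 0.8 bias and the data are invented.
prior = {"fair": 0.5, "biased": 0.5}
likelihood = {"fair":   {"H": 0.5, "T": 0.5},
              "biased": {"H": 0.8, "T": 0.2}}

observations = ["H", "H", "T", "H", "H", "H"]

posterior = dict(prior)
for obs in observations:
    for h in posterior:                      # weight by each hypothesis's likelihood
        posterior[h] *= likelihood[h][obs]
    total = sum(posterior.values())          # renormalize so probabilities sum to 1
    for h in posterior:
        posterior[h] /= total
```

Each observation shifts belief toward the hypothesis that predicted it better, a computational analogue of gradual induction.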

Philosophy and sociology of science

Main article: Philosophy of science
Further information: Sociology of science

Philosophy of science looks at the underpinning logic of the scientific method, at what separates science from non-science, and at the ethic that is implicit in science. Because philosophy is at least implicitly at the core of every decision we make or position we take, correct philosophy is a necessity for scientific inquiry to take place.[52] There are basic assumptions, derived from philosophy, that form the base of the scientific method - namely, that reality is objective and consistent, that humans have the capacity to perceive reality accurately, and that rational explanations exist for elements of the real world. These assumptions from metaphysical naturalism form the basis on which science is grounded.[52]

We find ourselves in a world that is not directly understandable. We find that we sometimes disagree with others as to the facts of the things we see in the world around us, and we find that there are things in the world that are at odds with our present understanding. The scientific method attempts to provide a way in which we can reach agreement and understanding. A "perfect" scientific method might work in such a way that rational application of the method would always result in agreement and understanding; a perfect method would arguably be algorithmic, and so not leave any room for rational agents to disagree. As with all philosophical topics, the search has been neither straightforward nor simple. Logical Positivist, empiricist, falsificationist, and other theories have claimed to give a definitive account of the logic of science, but each has in turn been criticized.

Thomas Samuel Kuhn examined the history of science in his The Structure of Scientific Revolutions, and found that the actual method used by scientists differed dramatically from the then-espoused method. His observations of science practice are essentially sociological and do not speak to how science is or can be practiced in other times and other cultures.

Imre Lakatos and Thomas Kuhn have done extensive work on the "theory laden" character of observation. Kuhn (1961) said the scientist generally has a theory in mind before designing and undertaking experiments so as to make empirical observations, and that the "route from theory to measurement can almost never be traveled backward". This implies that the way in which theory is tested is dictated by the nature of the theory itself, which led Kuhn (1961, p. 166) to argue that "once it has been adopted by a profession ... no theory is recognized to be testable by any quantitative tests that it has not already passed".

Paul Feyerabend similarly examined the history of science, and was led to deny that science is genuinely a methodological process. In his book Against Method he argues that scientific progress is not the result of applying any particular method. In essence, he says that "anything goes", by which he meant that for any specific methodology or norm of science, successful science has been done in violation of it. Criticisms such as his led to the strong programme, a radical approach to the sociology of science.

In his 1958 book, Personal Knowledge, chemist and philosopher Michael Polanyi (1891-1976) criticized the common view that the scientific method is purely objective and generates objective knowledge. Polanyi cast this view as a misunderstanding of the scientific method and of the nature of scientific inquiry, generally. He argued that scientists do and must follow personal passions in appraising facts and in determining which scientific questions to investigate. He concluded that a structure of liberty is essential for the advancement of science - that the freedom to pursue science for its own sake is a prerequisite for the production of knowledge through peer review and the scientific method.

The postmodernist critiques of science have themselves been the subject of intense controversy. This ongoing debate, known as the science wars, is the result of conflicting values and assumptions between the postmodernist and realist camps. Whereas postmodernists assert that scientific knowledge is simply another discourse (note that this term has special meaning in this context) and not representative of any form of fundamental truth, realists in the scientific community maintain that scientific knowledge does reveal real and fundamental truths about reality. Many books have been written by scientists which take on this problem and challenge the assertions of the postmodernists while defending science as a legitimate method of deriving truth.[53][54][55][56][57]

Communication, community, culture

Frequently the scientific method is not employed by a single person, but by several people cooperating directly or indirectly. Such cooperation can be regarded as one of the defining elements of a scientific community. Various techniques have been developed to ensure the integrity of the scientific method within such an environment.

Peer review evaluation

Scientific journals use a process of peer review, in which scientists' manuscripts are submitted by editors of scientific journals to (usually one to three) fellow (usually anonymous) scientists familiar with the field for evaluation. The referees may recommend publication, publication with suggested modifications, rejection, or, sometimes, publication in another journal. This serves to keep the scientific literature free of unscientific or crackpot work, helps to cut down on obvious errors, and generally improves the quality of the scientific literature.

Documentation and replication

Main article: Reproducibility

Sometimes experimenters may make systematic errors during their experiments, unconsciously veer from the scientific method (Pathological science) for various reasons, or, in rare cases, deliberately falsify their results. Consequently, it is a common practice for other scientists to attempt to repeat the experiments in order to duplicate the results, thus further validating the hypothesis.

Archiving

As a result, researchers are expected to practice scientific data archiving in compliance with the policies of government funding agencies and scientific journals. Detailed records of their experimental procedures, raw data, statistical analyses and source code are preserved in order to provide evidence of the effectiveness and integrity of the procedure and assist in reproduction. These procedural records may also assist in the conception of new experiments to test the hypothesis, and may prove useful to engineers who might examine the potential practical applications of a discovery.

Data sharing

When additional information is needed before a study can be reproduced, the author of the study is expected to provide it promptly, although a small charge may apply. If the author refuses to share data, appeals can be made to the journal editors who published the study or to the institution which funded the research.

Limitations

Note that it is not possible for a scientist to record everything that took place in an experiment; the experimenter must select the facts believed to be relevant and report them. This may lead, unavoidably, to problems later if some supposedly irrelevant feature is questioned. For example, Heinrich Hertz did not report the size of the room used to test Maxwell's equations, which later turned out to account for a small deviation in the results. The problem is that parts of the theory itself must be assumed in order to select and report the experimental conditions. The observations are hence sometimes described as being 'theory-laden'.

Dimensions of practice

Further information: Rhetoric of science

The primary constraints on contemporary western science are:

  • Publication, i.e. Peer review
  • Resources (mostly funding)

It has not always been like this: in the earlier era of the "gentleman scientist", funding and, to a lesser extent, publication were far weaker constraints.

Both of these constraints indirectly bring in a scientific method — work that too obviously violates the constraints will be difficult to publish and difficult to get funded. Journals do not require submitted papers to conform to anything more specific than "good scientific practice", and this is mostly enforced by peer review. Originality, importance and interest matter more; see, for example, the author guidelines for Nature.

Critics (see Critical theory) argue that these restraints are so nebulous in definition (e.g. "good scientific practice") and so open to ideological, or even political, manipulation apart from a rigorous practice of a scientific method that they often serve to censor rather than promote scientific discovery.[citation needed] Apparent censorship through refusal to publish ideas unpopular with mainstream scientists (unpopular for ideological reasons and/or because they seem to contradict long-held scientific theories) has soured the popular perception of scientists as neutral seekers of truth and has often denigrated the popular perception of science as a whole.

History

The development of the scientific method is inseparable from the history of science itself. Ancient Egyptian documents, such as early papyri, describe methods of medical diagnosis. In ancient Greek culture, the method of empiricism was described. The first experimental scientific method was developed by Muslim scientists, who introduced the use of experimentation and quantification to distinguish between competing scientific theories set within a generally empirical orientation, which emerged with Alhazen's optical experiments in his Book of Optics (1021).[58][59] The modern scientific method crystallized no later than in the 17th and 18th centuries. In his work Novum Organum (1620) — a reference to Aristotle's Organon — Francis Bacon outlined a new system of logic to improve upon the old philosophical process of syllogism. Then, in 1637, René Descartes established the framework for a scientific method's guiding principles in his treatise, Discourse on Method. The writings of Alhazen, Bacon and Descartes are considered critical in the historical development of the modern scientific method.

In the late 19th century, Charles Sanders Peirce proposed a schema that would turn out to have considerable influence in the development of current scientific method generally. Peirce accelerated the progress on several fronts. Firstly, speaking in the broader context of "How to Make Our Ideas Clear" (1878), Peirce outlined an objectively verifiable method to test the truth of putative knowledge in a way that goes beyond mere foundational alternatives, focusing upon both deduction and induction. He thus placed induction and deduction in a complementary rather than competitive context (the latter of which had been the primary trend at least since David Hume, who wrote in the mid-to-late 18th century). Secondly, and of more direct importance to modern method, Peirce put forth the basic schema for hypothesis testing that continues to prevail today. Extracting the theory of inquiry from its raw materials in classical logic, he refined it in parallel with the early development of symbolic logic to address the then-current problems in scientific reasoning. Peirce examined and articulated the three fundamental modes of reasoning that, as discussed above in this article, play a role in inquiry today, the processes currently known as abductive, deductive, and inductive inference. Thirdly, he played a major role in the progress of symbolic logic itself — indeed this was his primary specialty.

Karl Popper denied the existence of evidence[60] and of scientific method.[61] Popper held that there is only one universal method, the negative method of trial and error, which covers not only all products of the human mind (science, mathematics, philosophy, art and so on) but also the evolution of life. Beginning in the 1930s and with increased vigor after World War II, he argued that a hypothesis must be falsifiable and, following Peirce and others, that science would best progress using deductive reasoning as its primary emphasis, a position known as critical rationalism.[62] His formulations of logical procedure helped to rein in the excessive use of inductive speculation upon inductive speculation, and also strengthened the conceptual foundation for today's peer review procedures.

Relationship with mathematics

Science is the process of gathering, comparing, and evaluating proposed models against observables. A model can be a simulation, mathematical or chemical formula, or set of proposed steps. Science is like mathematics in that researchers in both disciplines can clearly distinguish what is known from what is unknown at each stage of discovery. Models, in both science and mathematics, need to be internally consistent and also ought to be falsifiable (capable of disproof). In mathematics, a statement need not yet be proven; at such a stage, that statement would be called a conjecture. But when a statement has attained mathematical proof, that statement gains a kind of immortality which is highly prized by mathematicians, and to which some mathematicians devote their lives.[63]

Mathematical work and scientific work can inspire each other.[64] For example, the technical concept of time arose in science, while timelessness was long a hallmark of mathematical topics. But today the Poincaré conjecture has been proven using time as a mathematical concept in which objects can flow (see Ricci flow).

Nevertheless, the connection between mathematics and reality (and so science, to the extent it describes reality) remains obscure. Eugene Wigner's paper, The Unreasonable Effectiveness of Mathematics in the Natural Sciences, is a well-known account of the issue from a Nobel Prize-winning physicist. In fact, some observers (including some well-known mathematicians such as Gregory Chaitin, and others such as Lakoff and Núñez) have suggested that mathematics is the result of practitioner bias and human limitation (including cultural ones), somewhat like the post-modernist view of science.

George Pólya's work on problem solving[65], the construction of mathematical proofs, and heuristic[66][67] show that mathematical method and scientific method differ in detail, while nevertheless resembling each other in using iterative or recursive steps.


  Step  Mathematical method   Scientific method
  1     Understanding         Characterization from experience and observation
  2     Analysis              Hypothesis: a proposed explanation
  3     Synthesis             Deduction: prediction from the hypothesis
  4     Review/Extend         Test and experiment

In Pólya's view, understanding involves restating unfamiliar definitions in one's own words, resorting to geometrical figures, and questioning what is and is not already known; analysis, which Pólya takes from Pappus[68], involves free and heuristic construction of plausible arguments, working backward from the goal, and devising a plan for constructing the proof; synthesis is the strict Euclidean exposition of step-by-step details[69] of the proof; review involves reconsidering and re-examining the result and the path taken to it.


Notes and references

  1. ^ "[4] Rules for the study of natural philosophy", Newton 1999, pp. 794-6, from the General Scholium, which follows Book 3, The System of the World.
  2. ^ scientific method, Merriam-Webster Dictionary.
  3. ^ Alhazen (Ibn Al-Haytham) Critique of Ptolemy, translated by S. Pines, Actes X Congrès internationale d'histoire des sciences, Vol I Ithaca 1962, as referenced in Sambursky 1974, p. 139
  4. ^ Alhazen, translated into English from German by M. Schwarz, from "Abhandlung über das Licht", J. Baarmann (ed. 1882) Zeitschrift der Deutschen Morgenländischen Gesellschaft Vol 36 as referenced in Sambursky 1974, p. 136
  5. ^ as quoted in Sambursky 1974, p. 136
  6. ^ "...the statement of a law—A depends on B—always transcends experience." —Born 1949, p. 6
  7. ^ "I believe that we do not know anything for certain, but everything probably." —Christiaan Huygens, Letter to Pierre Perrault, 'Sur la préface de M. Perrault de son traité del'Origine des fontaines' [1763], Oeuvres Complétes de Christiaan Huygens (1897), Vol. 7, 298. Quoted in Jacques Roger, The Life Sciences in Eighteenth-Century French Thought, ed. Keith R. Benson and trans. Robert Ellrich (1997), 163. Quotation selected by Bynum & Porter 2005, p. 317 Huygens 317#4.
  8. ^ As noted by Alice Calaprice (ed. 2005) The New Quotable Einstein Princeton University Press and Hebrew University of Jerusalem, ISBN 0-691-12074-9 p. 291. Calaprice denotes this not as an exact quotation, but as a paraphrase of a translation of A. Einstein's "Induction and Deduction". Collected Papers of Albert Einstein 7 Document 28. Volume 7 is The Berlin Years: Writings, 1918-1921. A. Einstein; M. Janssen, R. Schulmann, et al., eds.
  9. ^ http://www.iep.utm.edu/h/hempel.htm
  10. ^ Fleck 1975, pp. xxvii-xxviii
  11. ^ October, 1951. as noted in McElheny 2004, p. 40.
  12. ^ June, 1952. as noted in McElheny 2004, p. 43.
  13. ^ a b Cochran W, Crick FHC and Vand V. (1952) "The Structure of Synthetic Polypeptides. I. The Transform of Atoms on a Helix", Acta Cryst., 5, 581-586.
  14. ^ a b Friday, January 30, 1953. Tea time. as noted in McElheny 2004, p. 52.
  15. ^ a b Saturday, February 28, 1953, as noted in McElheny 2004, pp. 57-59.
  16. ^ "Observation and experiment are subject to a very popular myth. ... The knower is seen as a ... Julius Caesar winning his battles according to ... formula. Even research workers will admit that the first observation may have been a little imprecise, whereas the second and third were 'adjusted to the facts' ... until tradition, education, and familiarity have produced a readiness for stylized (that is directed and restricted) perception and action; until an answer becomes largely pre-formed in the question, and a decision confined merely to 'yes' or 'no' or perhaps to a numerical determination; until methods and apparatus automatically carry out the greatest part of the mental work for us." Fleck labels this thought style (Denkstil). Fleck 1975, p. 84.
  17. ^ See the hypothetico-deductive method, for example, Godfrey-Smith 2003, p. 236.
  18. ^ Jevons 1874, pp. 265-6.
  19. ^ pp. 65, 73, 92, 398 —Andrew J. Galambos, Sic Itur ad Astra, ISBN 0-88078-004-5 (Galambos learned scientific method from Felix Ehrenhaft)
  20. ^ Galilei 1638, pp. v-xii,1-300
  21. ^ Brody 1993, pp. 10-24 calls this the "epistemic cycle": "The epistemic cycle starts from an initial model; iterations of the cycle then improve the model until an adequate fit is achieved.".
  22. ^ Iteration example: Chaldean astronomers such as Kidinnu compiled astronomical data. Hipparchus was to use this data to calculate the precession of the Earth's axis. Fifteen hundred years after Kidinnu, Al-Batani, born in what is now Turkey, would use the collected data and improve Hipparchus' value for the precession of the Earth's axis. Al-Batani's value, 54.5 arc-seconds per year, compares well to the current value of 49.8 arc-seconds per year (26,000 years for Earth's axis to round the circle of nutation).
  23. ^ Recursion example: the Earth is itself a magnet, with its own North and South Poles William Gilbert (in Latin 1600) De Magnete, or On Magnetism and Magnetic Bodies. Translated from Latin to English, selection by Moulton & Schifferes 1960, pp. 113-117.
  24. ^ "The foundation of general physics ... is experience. These ... everyday experiences we do not discover without deliberately directing our attention to them. Collecting information about these is observation." —Hans Christian Ørsted ("First Introduction to General Physics" ¶13, part of a series of public lectures at the University of Copenhagen. Copenhagen 1811, in Danish, printed by Johan Frederik Schulz. In Kirstine Meyer's 1920 edition of Ørsted's works, vol. III pp. 151-190.) "First Introduction to Physics: the Spirit, Meaning, and Goal of Natural Science". Reprinted in German in 1822, Schweigger's Journal für Chemie und Physik 36, pp. 458-488, as translated in Ørsted 1997, p. 292
  25. ^ "When it is not clear under which law of nature an effect or class of effect belongs, we try to fill this gap by means of a guess. Such guesses have been given the name conjectures or hypotheses." —Hans Christian Ørsted (1811) "First Introduction to General Physics" as translated in Ørsted 1997, p. 297.
  26. ^ "In general we look for a new law by the following process. First we guess it. ...", —Feynman 1965, p. 156
  27. ^ "... the statement of a law - A depends on B - always transcends experience." —Born 1949, p. 6
  28. ^ "The student of nature ... regards as his property the experiences which the mathematician can only borrow. This is why he deduces theorems directly from the nature of an effect while the mathematician only arrives at them circuitously." —Hans Christian Ørsted (1811) "First Introduction to General Physics" ¶17, as translated in Ørsted 1997, p. 297.
  29. ^ Salviati speaks: "I greatly doubt that Aristotle ever tested by experiment whether it be true that two stones, one weighing ten times as much as the other, if allowed to fall, at the same instant, from a height of, say, 100 cubits, would so differ in speed that when the heavier had reached the ground, the other would not have fallen more than 10 cubits." Two New Sciences (1638), Galilei 1638, pp. 61-62. A more extended quotation is referenced by Moulton & Schifferes 1960, pp. 80-81.
  30. ^ In the inquiry-based education paradigm, the stage of "characterization, observation, definition, …" is more briefly summed up under the rubric of a Question.
  31. ^ "To raise new questions, new possibilities, to regard old problems from a new angle, requires creative imagination and marks real advance in science." —Einstein & Infeld 1938, p. 92.
  32. ^ Crawford S, Stucki L (1990), "Peer review and the changing research record", "J Am Soc Info Science", vol. 41, pp 223-228
  33. ^ See, e.g., Gauch, Hugh G., Jr., Scientific Method in Practice (2003), esp. chapters 5-8
  34. ^ Cartwright, Nancy (1983), How the Laws of Physics Lie. Oxford: Oxford University Press. ISBN 0198247044
  35. ^ Andreas Vesalius, Epistola, Rationem, Modumque Propinandi Radicis Chynae Decocti (1546), 141. Quoted and translated in C.D. O'Malley, Andreas Vesalius of Brussels, (1964), 116. As quoted by Bynum & Porter 2005, p. 597: Andreas Vesalius,597#1.
  36. ^ Crick, Francis (1994), The Astonishing Hypothesis ISBN 0-684-19431-7 p.20
  37. ^ Glen 1994, pp. 37-38.
  38. ^ "The instant I saw the picture my mouth fell open and my pulse began to race." —Watson 1968, p. 167. Page 168 shows the X-shaped pattern of the B-form of DNA, clearly indicating crucial details of its helical structure to Watson and Crick.
  39. ^ "Suddenly I became aware that an adenine-thymine pair held together by two hydrogen bonds was identical in shape to a guanine-cytosine pair held together by at least two hydrogen bonds. ..." —Watson 1968, pp. 194-197.
  40. ^ See, e.g., Physics Today, 59(1), p42. Richmann electrocuted in St. Petersburg (1753)
  41. ^ Aristotle, "Prior Analytics", Hugh Tredennick (trans.), pp. 181-531 in Aristotle, Volume 1, Loeb Classical Library, William Heinemann, London, UK, 1938.
  42. ^ Peirce, C.S. (1877), "The Fixation of Belief", Popular Science Monthly, v. 12, 1–15. Reprinted often, including (Collected Papers of Charles Sanders Peirce 5.358–387), (The Essential Peirce, v. 1, 109–123). Eprint. Internet Archive Popular Science Monthly 12.
  43. ^ Peirce, C.S. (1898), "Philosophy and the Conduct of Life", Lecture 1 of the Cambridge (MA) Conferences Lectures, published in Collected Papers of Charles Sanders Peirce 1.616-48 in part and in Reasoning and the Logic of Things, Ketner (ed., intro.) and Putnam (intro., comm.), 105-22, reprinted in The Essential Peirce v. 2, 27-41.
  44. ^ Peirce, C.S. (1903), "Lectures on Pragmatism", published in part (Collected Papers of Charles Sanders Peirce 5.14–212), Eprint. Fully published (Patricia Ann Turisi (ed.), Pragmatism as a Principle and Method of Right Thinking: The 1903 Harvard "Lectures on Pragmatism", State University of New York Press, Albany, NY, 1997), (The Essential Peirce, v. 2, 133–241).
  45. ^ a b c Peirce, C.S. (1877), "How to Make Our Ideas Clear", Popular Science Monthly, v. 12, pp. 286–302, Internet Archive PSM 12. Reprinted often, including (Collected Papers of Charles Sanders Peirce 5.388–410), (The Essential Peirce v. 1, 124–141). Arisbe Eprint.
  46. ^ Peirce, C. S. (1868), "Some Consequences of Four Incapacities", Journal of Speculative Philosophy v. 2, no. 3, pp. 140–157. Reprinted often, including (Collected Papers of Charles Sanders Peirce 5.264–317), (The Essential Peirce v. 1, 28–55). Arisbe Eprint
  47. ^ Peirce, C.S. (1878), "The Doctrine of Chances", Popular Science Monthly v. 12, pp. 604-615, Internet Archive PSM 12. Reprinted (Collected Papers of Charles Sanders Peirce 2.645-668), (The Essential Peirce v. 1, 142-154). "...death makes the number of our risks, the number of our inferences, finite, and so makes their mean result uncertain. The very idea of probability and of reasoning rests on the assumption that this number is indefinitely great. .... ...logicality inexorably requires that our interests shall not be limited. .... Logic is rooted in the social principle."
  48. ^ a b Peirce, C.S. (1908), "A Neglected Argument for the Reality of God", Hibbert Journal v. 7, 90-112. Reprinted often, including (Collected Papers of Charles Sanders Peirce 6.452-485), (The Essential Peirce v. 2, 434-450). Internet Archive Hibbert Journal 7.
  49. ^ Peirce, C.S. (1902), application to the Carnegie Institution, see MS L75.329-330, from Draft D of Memoir 27:

    Consequently, to discover is simply to expedite an event that would occur sooner or later, if we had not troubled ourselves to make the discovery. Consequently, the art of discovery is purely a question of economics. The economics of research is, so far as logic is concerned, the leading doctrine with reference to the art of discovery. Consequently, the conduct of abduction, which is chiefly a question of heuretic and is the first question of heuretic, is to be governed by economical considerations.

  50. ^ Peirce, C.S. (1903), "Pragmatism — The Logic of Abduction", Collected Papers of Charles Sanders Peirce 5.195-205, especially para. 196. Eprint.
  51. ^ Peirce, C.S., (1878) "The Probability of Induction", Popular Science Monthly, vol. 12, pp. 705-718, Internet Archive PSM 12. Reprinted often, including (Collected Papers of Charles Sanders Peirce 2.669-693), (The Essential Peirce v. 1, 155-169).
  52. ^ a b A., Kate; Sergei, Vitaly (2000). "Evolution and Philosophy: Science and Philosophy". Think Quest. http://library.thinkquest.org/C004367/ph1.shtml. Retrieved on 19 January 2009. 
  53. ^ Higher Superstition: The Academic Left and Its Quarrels with Science, The Johns Hopkins University Press, 1997
  54. ^ Fashionable Nonsense: Postmodern Intellectuals' Abuse of Science, Picador; 1st Picador USA Pbk. Ed edition, 1999
  55. ^ The Sokal Hoax: The Sham That Shook the Academy, University of Nebraska Press, 2000 ISBN 0803279957
  56. ^ A House Built on Sand: Exposing Postmodernist Myths About Science, Oxford University Press, 2000
  57. ^ Intellectual Impostures, Economist Books, 2003
  58. ^ Rosanna Gorini (2003), "Al-Haytham the Man of Experience, First Steps in the Science of Vision", International Society for the History of Islamic Medicine, Institute of Neurosciences, Laboratory of Psychobiology and Psychopharmacology, Rome, Italy:

    "According to the majority of the historians al-Haytham was the pioneer of the modern scientific method. With his book he changed the meaning of the term optics and established experiments as the norm of proof in the field. His investigations are based not on abstract theories, but on experimental evidences and his experiments were systematic and repeatable."

  59. ^ David Agar (2001). Arabic Studies in Physics and Astronomy During 800 - 1400 AD. University of Jyväskylä.
  60. ^ Logik der Forschung, new appendix *XIX (not yet available in the English edition Logic of scientific discovery)
  61. ^ Karl Popper: On the non-existence of scientific method. Realism and the Aim of Science (1983)
  62. ^ Karl Popper: Objective Knowledge (1972)
  63. ^ "When we are working intensively, we feel keenly the progress of our work; we are elated when our progress is rapid, we are depressed when it is slow." — the mathematician Pólya 1957, p. 131 in the section on 'Modern heuristic'.
  64. ^ "Philosophy [i.e., physics] is written in this grand book--I mean the universe--which stands continually open to our gaze, but it cannot be understood unless one first learns to comprehend the language and interpret the characters in which it is written. It is written in the language of mathematics, and its characters are triangles, circles, and other geometrical figures, without which it is humanly impossible to understand a single word of it; without these, one is wandering around in a dark labyrinth." —Galileo Galilei, Il Saggiatore (The Assayer, 1623), as translated by Stillman Drake (1957), Discoveries and Opinions of Galileo pp. 237-8, as quoted by di Francia 1981, p. 10.
  65. ^ Pólya 1957 2nd ed.
  66. ^ George Pólya (1954), Mathematics and Plausible Reasoning Volume I: Induction and Analogy in Mathematics,
  67. ^ George Pólya (1954), Mathematics and Plausible Reasoning Volume II: Patterns of Plausible Reasoning.
  68. ^ Pólya 1957, p. 142
  69. ^ Pólya 1957, p. 144

Further reading

  • Bacon, Francis Novum Organum (The New Organon), 1620. Bacon's work described many of the accepted principles, underscoring the importance of theory, empirical results, data gathering, experiment, and independent corroboration.
  • Bauer, Henry H., Scientific Literacy and the Myth of the Scientific Method, University of Illinois Press, Champaign, IL, 1992
  • Brody, Thomas A. (1993), The Philosophy Behind Physics, Springer Verlag, ISBN 0-387-55914-0 (Luis de la Peña and Peter E. Hodgson, eds.).
  • Burks, Arthur W., Chance, Cause, Reason — An Inquiry into the Nature of Scientific Evidence, University of Chicago Press, Chicago, IL, 1977.
  • Bynum, W.F.; Porter, Roy (2005), Oxford Dictionary of Scientific Quotations, Oxford, ISBN 0-19-858409-1.
  • Chomsky, Noam, Reflections on Language, Pantheon Books, New York, NY, 1975.
  • Dewey, John, How We Think, D.C. Heath, Lexington, MA, 1910. Reprinted, Prometheus Books, Buffalo, NY, 1991.
  • di Francia, G. Toraldo (1981), The Investigation of the Physical World, Cambridge University Press, ISBN 0-521-29925-X.
  • Earman, John (ed.), Inference, Explanation, and Other Frustrations: Essays in the Philosophy of Science, University of California Press, Berkeley & Los Angeles, CA, 1992.
  • Einstein, Albert; Infeld, Leopold (1938), The Evolution of Physics: from early concepts to relativity and quanta, New York: Simon and Schuster, ISBN 0-671-20156-5 
  • Fraassen, Bas C. van, The Scientific Image, Oxford University Press, Oxford, UK, 1980.
  • Heisenberg, Werner, Physics and Beyond, Encounters and Conversations, A.J. Pomerans (trans.), Harper and Row, New York, NY 1971, pp. 63–64.
  • Holton, Gerald, Thematic Origins of Scientific Thought, Kepler to Einstein, 1st edition 1973, revised edition, Harvard University Press, Cambridge, MA, 1988.
  • Kuhn, Thomas S., "The Function of Measurement in Modern Physical Science", ISIS 52(2), 161–193, 1961.
  • Kuhn, Thomas S., The Structure of Scientific Revolutions, University of Chicago Press, Chicago, IL, 1962. 2nd edition 1970. 3rd edition 1996.
  • Kuhn, Thomas S., The Essential Tension, Selected Studies in Scientific Tradition and Change, University of Chicago Press, Chicago, IL, 1977.
  • Latour, Bruno, Science in Action, How to Follow Scientists and Engineers through Society, Harvard University Press, Cambridge, MA, 1987.
  • Losee, John, A Historical Introduction to the Philosophy of Science, Oxford University Press, Oxford, UK, 1972. 2nd edition, 1980.
  • Maxwell, Nicholas, The Comprehensibility of the Universe: A New Conception of Science, Oxford University Press, Oxford, 1998. Paperback 2003.
  • Misak, Cheryl J., Truth and the End of Inquiry, A Peircean Account of Truth, Oxford University Press, Oxford, UK, 1991.
  • Moulton, Forest Ray; Schifferes, Justus J. (eds.) (1960), The Autobiography of Science, second edition, Doubleday.
  • Newell, Allen, Unified Theories of Cognition, Harvard University Press, Cambridge, MA, 1990.
  • Newton, Isaac (1687, 1713, 1726), Philosophiae Naturalis Principia Mathematica, third edition, University of California Press, ISBN 0-520-08817-4. From I. Bernard Cohen and Anne Whitman's 1999 translation, 974 pages.
  • Ørsted, Hans Christian (1997), Selected Scientific Works of Hans Christian Ørsted, Princeton, ISBN 0-691-04334-5. Translated to English by Karen Jelved, Andrew D. Jackson, and Ole Knudsen (translators 1997).
  • Peirce, C.S. (1957), Essays in the Philosophy of Science, New York, NY: Bobbs–Merrill, Vincent Tomas (ed.).
  • Peirce, C.S. (1931-1935, 1958), Collected Papers, Cambridge, MA: Harvard University Press. Cited as CP vol.para. Vols. 1-6, Charles Hartshorne and Paul Weiss (eds.); vols. 7-8, Arthur W. Burks (ed.).
  • Peirce, C.S. (1981), Writings of Charles S. Peirce, A Chronological Edition, Bloomington, IN: Indiana University Press. Cited as W vol.para. Peirce Edition Project (eds.), Vol. 1 (1857-66), Vol. 2 (1867-71), Vol. 3 (1872-78), Vol. 4 (1879-85), Vol. 5 (1884-85), Vol. 6 (1886-90).
  • Peirce, C.S. (1998), The Essential Peirce, Selected Philosophical Writings, Bloomington, IN: Indiana University Press, Peirce Edition Project (eds.). Volume 1 (1867–1893) is ISBN 0-253-32849-7; Volume 2 (1893–1913) is ISBN 0-253-33397-0.
  • Piattelli-Palmarini, Massimo (ed.), Language and Learning, The Debate between Jean Piaget and Noam Chomsky, Harvard University Press, Cambridge, MA, 1980.
  • Popper, Karl R., Unended Quest, An Intellectual Autobiography, Open Court, La Salle, IL, 1982.
  • Putnam, Hilary, Renewing Philosophy, Harvard University Press, Cambridge, MA, 1992.
  • Rorty, Richard, Philosophy and the Mirror of Nature, Princeton University Press, Princeton, NJ, 1979.
  • Salmon, Wesley C., Four Decades of Scientific Explanation, University of Minnesota Press, Minneapolis, MN, 1990.
  • Sambursky, Shmuel (ed.) (1974), Physical Thought from the Presocratics to the Quantum Physicists, Pica Press, ISBN 0-87663-712-8 .
  • Shimony, Abner, Search for a Naturalistic World View: Vol. 1, Scientific Method and Epistemology, Vol. 2, Natural Science and Metaphysics, Cambridge University Press, Cambridge, UK, 1993.
  • Thagard, Paul, Conceptual Revolutions, Princeton University Press, Princeton, NJ, 1992.
  • Watson, James D. (1968), The Double Helix, New York: Atheneum, Library of Congress card number 68-16217 .
  • Ziman, John (2000). Real Science: what it is, and what it means. Cambridge, UK: Cambridge University Press.
