Thinking scientifically

Home Forums Potpourri Thinking scientifically

This topic contains 1 reply, has 2 voices, and was last updated by hallenrm 2 weeks, 1 day ago.

Viewing 2 posts - 1 through 2 (of 2 total)
  • Author
    Posts
  • #175

    AdminCSEC
    Keymaster

Yesterday was a historic day for Indian cricket: for the first time, many Indians are feeling on top of the world, because the Indian team has succeeded in defeating the master team, the Aussies, on their own soil. A great deed indeed.

I hope we shall achieve many such deeds in future, and not only in cricket. Our country needs fresh thinking about our national priorities; the older generation of scientists has achieved what it could, and it's high time the younger generation started thinking.

Here’s a link to the full text of an article published in the September 2005 issue of Scientific American India. The article, entitled “How Should We Set Priorities?”, is by W. Wayt Gibbs.

Here are the opening paragraphs of the article:

    W Wayt Gibbs wrote:
    How should humanity progress in the next two generations? Which challenges should we engage, in what order, and with how much sacrifice (if any) of comfort and liberty? There are probably as many distinct responses to these questions as there are thoughtful people on the planet. Not all answers are equally wise, of course, but none can be definitive either. These are ultimately questions about one’s moral values and personal preferences.

Experts can help us to understand which problems are most threatening, which solutions are most promising, and how costly it might be to act or to wait. Scientists can exhort us – just as the other contributors to this special issue urge us to focus on ending extreme poverty, securing biodiversity “hot spots,” improving agricultural infrastructure, boosting the efficiency of our energy use, and reining in epidemic diseases – yet the experts cannot directly steer the course of humanity.

I think at least a few members of the DU science community would be interested in a discussion of its contents.

    #289

    hallenrm
    Participant

    Here’s an interesting addendum to my earlier post:

    C.H. Llewellyn Smith wrote:

    The use of basic science: What science to fund

I have argued that economic, as well as cultural, considerations lead to the conclusion that public funding should be primarily directed to basic, rather than applied, science. If, however, we appeal to economic arguments in this way, we cannot object to their use in discussions of the partition of funding between different areas of basic science. The problem is that “both forecasting and innovation are highly stochastic processes, so that the probability of correctly forecasting an innovation, being the product of two low probabilities, is, in theory, close to zero.”
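The arithmetic behind that last claim is worth making explicit: if forecasting correctly and the innovation actually happening are treated as independent, low-probability events, the chance of both is their product, which shrinks much faster than either factor alone. A minimal sketch (the probability values below are illustrative assumptions, not figures from the article):

```python
# Illustration of "the product of two low probabilities is close to zero".
# Both probabilities are assumed values, chosen only to show the effect.

p_forecast_correct = 0.05   # assumed chance a given forecast is right
p_innovation_occurs = 0.05  # assumed chance the innovation materialises

# Under an independence assumption, the joint probability is the product.
p_both = p_forecast_correct * p_innovation_occurs

print(p_both)  # 0.0025 -- an order of magnitude smaller than either factor
```

Even with each individual probability at 5%, the joint probability is a quarter of a percent, which is the sense in which Llewellyn Smith calls it "in theory, close to zero".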

    If Rutherford, who discovered the nucleus, could not foresee nuclear power, could a government committee do better? Who could have foreseen warm superconductors, fullerenes, or the World Wide Web? Earlier I suggested that Faraday might have foreseen the applications of electricity but in 1867, nine years after Faraday’s death, a meeting of British scientists pronounced that “Although we cannot say what remains to be invented, we can say that there seems to be no reason to believe that electricity will be used as a practical mode of power”. In a similar vein, it is well known that Thomas Watson, the creator of IBM, said in 1947 that a single computer “could solve all the important scientific problems of the world involving scientific calculations” but that he did not foresee other uses for computers.

This unpredictability, which I have argued is one reason that it is up to governments to fund basic science in the first place, also means that in practice it is probably impossible, and very possibly dangerous, to try to distribute funding for basic science on the basis of perceived economic utility. The traditional criteria of scientific excellence, and the excellence of the people involved, are probably as good as any, and in my opinion these are the criteria that should continue to be used; after all, money is more abundant than brains even in this cost-conscious era.

The fact that results of basic research are unpredictable does not mean that economic incentives to find solutions to specific applied problems are futile. Nineteenth-century scientists sought methods for the artificial fixation of nitrogen, but failed until the First World War deprived Germany of fertilisers, whereupon a solution was quickly found. US science, technology and money met the political imperative to put a man on the moon before 1970. But it is important to understand when such incentives are likely to be effective and when they are not. President Nixon launched a battle against cancer, modelled explicitly on the success of the space programme, but it failed. The reason is clear enough. The physical principles involved in putting men on the moon were well understood before the space programme began, while our knowledge of the biological principles underlying the growth and mutation of cells is still limited.

This brings me to the funding of applied research. I have argued that, generally, governments should keep “away from the market” and fund areas that are ‘public goods’ because the returns are long-term or not commercial, e.g. research on the environment or traffic control. Near-market work can and should be left mainly to industry, which, according to J. Baruch, agrees; the following paragraph is based on his recent article (ref. 18).

Big companies such as 3M, IBM, Siemens, Ford, etc. want to innovate with current technologies that can be priced and predicted accurately, and do not want the help of academics, which would only force them to share the profits. Nor are academics generally interested in such collaboration. The exceptions are academics wanting to innovate with available technologies in order to develop new instruments for their research (a category which includes particle physicists). Here there is a considerable mutual benefit and a considerable synergy between technological innovation for profit and technological innovation for research. Indeed, according to Baruch, “The people who have most to offer [to industry] are the dedicated research scientists, not the academic technologists or engineers, who do not wish to be distracted from their research in order to help solve commonplace technological problems” … more …

The author is a former Director-General of CERN.

