AstraZeneca neither confirms nor denies that it will ditch antibiotics research


A computer image of a cluster of drug-resistant Mycobacterium tuberculosis.

US Centers for Disease Control and Prevention/ Melissa Brower

The fight against antibiotic-resistant microbes would suffer a major blow if widely circulated rumours that pharmaceutical giant AstraZeneca plans to disband its in-house antibiotic development were confirmed. The company called the rumours “highly speculative” while not explicitly denying them.

On 23 October, drug-industry consultant David Shlaes wrote on his blog that AstraZeneca, a multinational behemoth headquartered in London, “has told its antibiotics researchers that they should make efforts to find other jobs in the near future”, and that in his opinion this heralds the end of in-house antibiotic development at the company. “As far as antibiotic discovery and development goes, this has to be the most disappointing news of the entire antibiotic era,” wrote Shlaes.

AstraZeneca would not directly address these claims when approached by Nature for comment. In its statement it said, in full:

The blog is highly speculative. We continue to be active in anti-infectives and have a strong pipeline of drugs in development. However, we have previously said on a number of occasions that as we focus on our core therapy areas (Oncology, CVMD [cardiovascular and metabolic diseases] and Respiratory, Inflammation and Autoimmune) we will continue to remain opportunity driven in infection and neuroscience, in particular exploring partnering opportunities to maximise the value of our pipeline and portfolio.

Research into antibiotics is notorious for its high cost and high failure rate. AstraZeneca has previously said that its main research focus would be on areas other than antibiotic development.

Public-health experts have been warning about a trend among large pharmaceutical companies to move away from antibiotics research — just as the World Health Organization and others have pointed to the rising threat of deadly multi-drug-resistant strains of bacteria such as Mycobacterium tuberculosis or Staphylococcus aureus (see ‘Antibiotic resistance: The last resort’).

More than half of 2007-2012 research articles now free to read

More than half of all peer-reviewed research articles published from 2007 to 2012 are now free to download somewhere on the Internet, according to a report produced for the European Commission, published today. That is a step up from the situation last year, when only one year – 2011 – reached the 50% free mark. But the report also underlines how availability dips in the most recent year, because many papers are only made free after a delay.


“A substantial part of the material openly available is relatively old, or as some would say, outdated,” writes Science-Metrix, a consultancy in Montreal, Canada, which conducted the study, one of a series of reports on open-access policies and open data.

The study (which has not been formally peer-reviewed) forms part of the European Commission’s efforts to track the evolution of open access. Science-Metrix uses automated software to search online for hundreds of thousands of papers from the Scopus database.

The company finds that the proportion of new papers published directly in open-access journals reached almost 13% in 2012. The bulk of the Internet’s free papers are available through other means – made open by publishers after a delay, or by authors archiving their manuscripts online. But their proportion of the total seems to have stuck at around 40% for the past few years. That apparent lack of impetus is partly because of a ‘backfilling’ effect, whereby the past is made to look more open as authors upload versions of older paywalled papers into online repositories, the report says. Over the past year, for instance, close to 14,000 papers originally published in 1996 were made available for free.

“The fundamental problem highlighted by the Science-Metrix findings is timing,” writes Stevan Harnad, an open-access advocate and cognitive scientist at the University of Quebec in Montreal, Canada. “Over 50% of all articles published between 2007 and 2012 are freely available today. But the trouble is that their percentage in the most critical years, namely, the 1-2 years following publication, is far lower than that. This is partly because of publisher open access embargoes, partly because of author fears and sluggishness, but mostly because not enough strong, effective open access mandates have as yet been adopted by institutions and funders.”

The report’s conclusions are only estimates, as the automated software does not pick up every free paper, and this incompleteness must be adjusted for in the figures (typically adding around 5-6% to the total, a margin calculated by testing the software on a smaller, hand-checked sample of papers). And many of the articles, although free to read, do not meet formal definitions of open access – for example, they do not include details on whether readers can freely reuse the material. Éric Archambault, the founder and president of Science-Metrix, says it is still hard to track different kinds of open manuscripts, and when they became free to read.
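The correction is essentially a recall adjustment: run the harvester over a sample whose free papers have been verified by hand, measure the fraction it catches, and scale the raw figure up accordingly. Below is a minimal Python sketch of that logic; every number in it is an invented illustration, not a figure from the Science-Metrix report.

```python
# Illustrative recall correction for an automated free-paper harvester.
# All numbers are invented for illustration; the report's actual 5-6%
# adjustment comes from its own hand-checked sample.
hand_checked_free = 550    # papers confirmed free to read in a manual audit (assumed)
detected_free = 500        # of those, the ones the software also found (assumed)

recall = detected_free / hand_checked_free   # fraction the harvester catches (~0.91)

raw_free_share = 0.48      # uncorrected free share measured on the full corpus (assumed)
corrected_share = raw_free_share / recall    # scale up for the papers it misses

print(f"raw: {raw_free_share:.1%}  corrected: {corrected_share:.1%}  "
      f"(adjustment: +{corrected_share - raw_free_share:.1%})")
```

With these invented inputs the correction adds about five percentage points, in line with the 5–6% margin the report describes.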

The proportion of free papers also differs by country and by subject. Biomedical research (71% estimated free between 2011 and 2013) is far more open than chemistry (39%), for example. The study suggests that from 2008 to 2013 the world’s average was 54%, with Brazil (76%) and the Netherlands (74%) particularly high. The United Kingdom, where the nation’s main public funder, Research Councils UK, has set a 45% target for 2013–14, had already reached 64% in earlier years, the report suggests.

The study comes during Open Access week, which is seeing events around the world promoting the ideas of open access to research. Yesterday saw the launch of the ‘Open Access Button’ in London – a website and app that allows users to find free research. If no free copy is available, the app promises to email authors asking them to upload a free version of their paper – with an explanation direct from the user who needs the manuscript. “We are trying to make open access personal – setting up a conversation between the author and the person who wants access,” says Joe McArthur, who co-founded the project and works at the Right to Research Coalition, an advocacy group in London.

Outbreak of great quakes underscores Cascadia risk

Posted on behalf of Alexandra Witze.

The 18 great earthquakes that have struck Earth in the past decade hold ominous lessons for western North America, a top seismologist has warned. Many of these large quakes — including the 2004 Sumatra quake that spawned the Indian Ocean tsunami, and the 2011 Tohoku disaster in Japan — were surprisingly different from one another despite their similar geologic settings.

That variety implies that almost any scenario is possible in another part of the Pacific Rim where quake risk is thought to be high — along the Cascadia subduction zone offshore of Washington, Oregon, and other parts of the western United States and Canada.

“We do not fully understand the limits of what can happen,” says Thorne Lay, a seismologist at the University of California, Santa Cruz. “We have to be broadly prepared to respond.”

Lay spoke on 21 October at the Geological Society of America meeting in Vancouver, Canada, a city on the front lines of Cascadia earthquake risk.

The last great quake in the region happened in 1700. Conventional wisdom holds that the next one, perhaps as large as magnitude 9, could strike at any time in the next several hundred years. Geologically speaking, Cascadia is a classic subduction zone, where one plate of Earth’s crust plunges beneath another, building up stress and occasionally relieving it in large earthquakes.

The recent spate of great subduction-zone quakes, of magnitude 8 or larger, began with the 2004 Sumatra earthquake. On average, each year since then has brought 1.8 great quakes, more than twice the rate of the previous century.
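The 1.8-per-year figure is straightforward arithmetic: 18 great quakes over the roughly ten years since Sumatra. A back-of-the-envelope Poisson calculation, sketched below in Python, shows how such a cluster compares with the slower historical rate; the 0.85-per-year background is an assumed value chosen to be consistent with “more than twice the rate of the previous century”, not a number from Lay’s talk.

```python
from math import exp, factorial

# How surprising are 18 great (M >= 8) quakes in a decade if events
# arrive at the previous century's rate? The background rate below is
# an assumed illustrative value, not a figure from the talk.
background_rate = 0.85        # great quakes per year (assumed)
years = 10
observed = 18                 # great quakes since the 2004 Sumatra event

mu = background_rate * years  # expected count under the historical rate

# P(X >= observed) for a Poisson-distributed count with mean mu
p_cluster = 1 - sum(exp(-mu) * mu**k / factorial(k) for k in range(observed))
print(f"expected {mu:.1f}, observed {observed}, P(>= {observed} by chance) = {p_cluster:.4f}")
```

Whether such clustering is statistically meaningful is a separate debate among seismologists; the sketch only illustrates the rate comparison behind the “more than twice” claim.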

In large part, they happened where and when seismologists expected them. “The quakes are basically filling in a deficiency of activity,” Lay says. But their details have been surprising.

The 2004 Sumatra quake, for instance, ruptured unexpected portions of a subduction zone off Indonesia, where the fault zone bends as opposed to running straight. That implies that areas in Cascadia with unusual geometry might also be at risk, Lay says.

In 2007, in Peru, a major earthquake began, then essentially stalled for 60 seconds before picking up again and eventually generating a large tsunami. That start-stop-start pattern raises challenges for Cascadia because seismologists are trying to develop an accurate earthquake early-warning system there.

And in April 2014, a Chilean quake ruptured a far shorter portion of a subduction zone than scientists had expected. That suggests that researchers can’t be complacent about thinking they know which parts of Cascadia might break, Lay says. (The worst-case scenario for Cascadia involves a rupture of approximately 1,000 kilometres.)

That’s not to say scientists aren’t preparing. The recently launched M9 project, coordinated out of the University of Washington in Seattle, aims to help officials cope with the risk of a great Cascadia quake. At the Vancouver meeting, Arthur Frankel of the US Geological Survey in Seattle showed early results of calculations of where the ground might shake the most. Enclosed basins, such as the one beneath Seattle, amplify the shaking, he reported.

Geologists face off over Yukon frontier

Posted on behalf of Alexandra Witze. 

The walls of the Geological Survey of Canada’s Vancouver office are, not surprisingly, plastered with maps. There’s one of the country of Canada, one of the province of British Columbia, and even a circumpolar Arctic map centered on the North Pole.


The Klondike schist of Canada (shown in green) stops at the border with the United States.

Alexandra Witze

All display that distinctive rainbow mélange so typical of professional geologic maps. Each major rock formation is represented by its own colour, so that pinks and purples and yellows swirl in great stretches representing mountain ranges, coastal plains, and every conceivable landscape in between.

But lying on the table of the survey’s main conference room is a much more problematic map. It shows part of the far northern boundary between the United States and Canada, along a stretch between Alaska and the Yukon territory. And the two sides, on either side of the international border, do not match.

It’s not a question of Canada using one set of colours for its map and the United States using another. The geology simply does not line up. To the east, Canadian mappers have sketched a formation called the Klondike schist, which is associated with the gold-rich rocks that fuelled the Klondike gold rush in the late 1890s. To the west, US maps show nothing like it.

“We don’t know why,” says Jamey Jones, a geologist with the US Geological Survey (USGS) in Anchorage, Alaska. “We have got to figure out why these aren’t matching.”

He and two dozen scientists from both sides of the border — but clad equally in plaid shirts and hiking boots — met in Vancouver on 20 October to try to hammer out the discrepancies. For two hours they compared mapping strategies, laid out who needed to explore what next, and swapped tips about the best ways to get helicopters in the region.

The last frontier

At one level, the differing maps are a relatively minor academic point to sort out. Such glitches are fairly common whenever geologists have to match a ‘quadrangle’ mapped in one era, or with one technique, against a neighbouring one mapped in another. And it’s not unusual for geology to not quite line up across international borders.

But American and Canadian geologists have reconciled their maps along nearly the entire northern stretch where Alaska and the Yukon meet, says Frederic “Ric” Wilson, a geologist with the USGS in Anchorage. This last bit is the only one that does not match — and it may well be because the Canadian maps are four years old, while the American ones are four decades old.

The US maps stretch back to the days of legendary geologist Helen Foster, who mapped large parts of Alaska after making her name as a post-war military geologist in former Japanese territories. “With her, you walked every single ridge,” recalls Wilson. “Every single ridge.”

All that walking produced maps of huge stretches of the remote Alaskan landscape. They include the 1970 quadrangle map now in question, which abuts a much newer Canadian quadrangle to the east. Together the maps span part of a massive geological feature known as the Yukon-Tanana Terrane, a collection of rocks caught up in the mighty smearing crush where the Pacific crustal plate collides against North America.

The Canadian side of the map is in good shape. Prompted in part by intense mining interest, geologists there have mapped the Klondike in modern detail. “I’m willing to integrate any piece of data that comes in,” says Mo Colpron, a geologist with the Yukon Geological Survey. “If you guys come up with things that affect how our side of the border works, then we can sit down and talk and try to mesh it.”

That leaves the burden of work on the US side, to update the Foster maps. “The reconciliation project is what it’s called,” says Rick Saltus, a geologist with the USGS in Denver, Colorado, who served as meeting emcee. “We’re taking a three-year look at cross-border tectonic connections, because things look a little different from one side to the other.”

This summer, Jones and his colleagues hired a helicopter to take them everywhere the Foster maps ran up against the Klondike formation. “We’ve seen a lot of rocks we didn’t anticipate seeing,” he says. That data will go into the new and improved US maps.

There is, however, only so much scientists can do. Citing border regulations, Jones says, the helicopter pilot was unwilling to take them just a tiny bit over into Canada so they could see the geology on the Yukon side.

Doctor bets against traditional Chinese medicine

Beijing

The Beijing University of Chinese Medicine is one institution where the government promotes the practice.

BUCM

A sceptic of traditional Chinese medicine is challenging practitioners of the age-old craft to prove themselves by putting his own money on the line. One has accepted the challenge. At stake is the claim that practitioners can discern whether a woman is pregnant by her pulse.

Traditional Chinese medicine (TCM) is a point of contention in China. Although the government is keen to promote its use in the clinic and, in modernized form, as part of drug discovery, some feel that much of it is unproven and that the government is throwing its money away. There have also been high-profile cases of fraud linked to such research, and the practice is criticized for its dependence on endangered species such as the Saiga antelope (Saiga tatarica).

Ah Bao, the online nickname of a burn-care doctor at Beijing Jishuitan hospital, has been an adamant critic of TCM on Chinese social media, often referring to it as “fake”. He issued the challenge on 13 September, and Zhen Yang, a practitioner at the Beijing University of Chinese Medicine, took him up on it.

Ah Bao put up 50,000 yuan (more than US$8,000), and at his urging others have donated more than 50,000 yuan, making the prize worth more than 100,000 yuan in total. Ah Bao turned down Nature’s request to be interviewed, saying that he has been overwhelmed by media attention.

Yang will have to assess with 80% accuracy whether women are pregnant. The two are reportedly working out the terms of the contest, with a tentative set-up involving 32 women who would be separated from Yang by a screen.
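Under that tentative set-up the 80% bar is a demanding one. A minimal sketch of the arithmetic, assuming 32 independent yes/no calls and a guesser who answers each at random (neither detail is settled in the reported terms):

```python
from math import ceil, comb

n = 32                  # women in the tentatively proposed set-up
needed = ceil(0.8 * n)  # correct calls required for 80% accuracy -> 26

# Chance of clearing the bar by guessing, assuming each call is an
# independent 50/50 guess (an assumption, not part of the contest terms).
p_guess = sum(comb(n, k) for k in range(needed, n + 1)) / 2**n
print(f"need {needed}/{n} correct; P(pass by guessing) = {p_guess:.2e}")
```

On those assumptions, passing by luck alone would be roughly a 1-in-3,700 event, which is presumably why the threshold was set well above 50%.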

Australia puts science in ‘competitiveness’ drive


Minister for Industry Ian Macfarlane

Australian Department of Industry

The Australian government has unveiled plans to increase the commercial return on its billions in research funding and to pump more resources into boosting industry-science links.

The government appointed ten experts — five business leaders and five leading researchers — to a ‘Commonwealth Science Council’ to advise on science priorities and to become the “pre-eminent body for advice on science and technology” in Australia, according to the ‘competitiveness agenda’ released on 14 October.

The council will be chaired by Prime Minister Tony Abbott, with Industry Minister Ian Macfarlane as deputy chair. It will replace an existing (and some say moribund) advisory group.

The statement also says that there will be a “sharpening” of incentives for collaboration between research and industry. Five new centres to improve collaboration, and increase the competitiveness of industries including mining, oil and medical technologies, will be set up at a cost of Aus$188.5 million (US$164 million).

The Abbott government has come in for fierce criticism over its perceived lack of support for science, with many government-funded researchers and science agencies facing cutbacks (see ‘Australian cuts rile researchers’). Macfarlane has previously said that the competitiveness agenda would show how the government was dealing with these concerns, by setting science at the centre of industry policy.

Australia’s chief scientist Ian Chubb said that the new council would “provide the strategic thinking and direction that a national transformation truly demands” and also welcomed an Aus$12 million investment in science education. “This is about improving the impact, focus and prioritisation of Australia’s investment in science and research,” he said in a statement.

The Australian Academy of Science also welcomed the announcements. Its secretary of science policy Les Field said in a statement: “Anything which aligns science more closely with industry has got to be a big plus, especially when this is an area where Australia traditionally struggles.”

Tragedy strikes Taiwanese research ship


The sinking of the Ocean Research V in an image from a video released by Taiwan’s Coast Guard.

Taiwan Coast Guard/AFP via Getty

Two scientists died on 11 October after the research vessel they were on, Taiwan’s Ocean Research V, capsized in the Taiwan Strait. Another 25 scientists and 18 crew members were rescued. 

The 73-metre, 2,700-tonne vessel, which had been operating only since February 2013, cost 1.5 billion new Taiwan dollars (US$50 million). It had three laboratories, sonar for seafloor mapping, multiple plankton samplers and other devices for comprehensive ocean exploration. It was built to carry out scientific as well as resource surveys, including sampling sea-bed gas hydrates and surveying offshore wind-turbine sites.

The Ocean Research V was also equipped with a dynamic positioning system to enable it “to conduct highly precise action on sea even under strong winds in the situation of typhoon or strong monsoon”, according to the Taiwan Ocean Research Institute, which operated it. But on the night of 10 October, one day after setting sail, the ship capsized near Penghu island, some 50 kilometres off Taiwan’s western coast. Some speculate that it hit a reef after being blown off course by strong winds related to a typhoon.

Hsu Shih-chieh, a researcher at the Academia Sinica in Taipei, reportedly died after making efforts to save his fellow researchers. Lin Yi-chun, a scientist at the Taiwan Ocean Research Institute, also died.

The Ministry of Science and Technology is now investigating the cause of the accident.

Stem-cell fraud makes for box office success

Posted on behalf of David Cyranoski and Soo Bin Park

Fictionalized film follows fabricated findings


Stem-cell fraudster faces down the journalist who debunks him in the film sweeping Korean cinemas.

Wannabe Fun

A movie based on the Woo Suk Hwang cloning scandal drew more than 100,000 viewers on its opening day (2 October) and has been topping box office sales in South Korea since then. With some of the country’s biggest stars, it has made a blockbuster out of a dismal episode in South Korean stem-cell research — and revealed the enduring tension surrounding it.

The movie, Whistleblower, shines a sympathetic light on Woo Suk Hwang, the professor who in 2004 and 2005 claimed to have created stem-cell lines from cloned human embryos. The achievement would have provided a means to make cells genetically identical to a patient’s own, and able to form almost any type of cell in the body. But hopes were shattered when Hwang’s claims turned out to be based on fraudulent data and unethical procurement of eggs. The whistleblower who revealed the fraud says the new movie strays far from reality.

“This topic is sensitive, so I was hesitant when I got the first offer,” said director Yim Soon-rye at the premiere on 16 September in Seoul. “I wanted to portray him [Lee Jang-hwan, Hwang’s character in the film] as a character who faces a very human problem, and to show there is room to understand his actions.” Although clearly inspired by the real-life events surrounding Hwang and his cloning claims, the film does not aim to be a true representation of events, but a ‘restructured fiction’ created for a movie audience.

The movie broadly follows the scandal as it actually unravelled, tracing the process through which the stem-cell claims were debunked. Some changes are made, apparently for dramatic effect: Snuppy, the Afghan hound produced by cloning in Hwang’s laboratory, becomes Molly, also an Afghan hound, but one with cancer. When Lee sees the writing on the wall, he is shown going to a Buddhist temple where he rubs Molly’s fur, saying “I came too far … I missed my chance to stop.”

Yim says he wanted the fraudster “to be interpreted multi-dimensionally, rather than as a simple fraud or evil person”.

But rather than the scientists, Yim put the perseverance of the reporter at the centre of the film, and ends up skewing relevant facts, says Young-Joon Ryu, the real whistleblower. Ryu, who had been a key figure in Hwang’s laboratory, says his own contributions and those of online bloggers were credited to the reporter. (The discovery that Hwang had unethically procured eggs, first reported in Nature, was also credited to the reporter.)

The film has refuelled anger in some Hwang supporters who believe, despite evidence to the contrary, that Hwang did have human-cloning capabilities and that the scandal deprived the country of a star scientist. They are back online calling Ryu a betrayer.

Ryu understands that a movie might emphasize “fast action, dramatic conflicts and famous actors” to increase box-office revenues. But having suffered through one perversion of the truth as Hwang made his original claims, he says that watching the film he felt he was witnessing another.

UK launches space weather forecast centre

The UK officially opened its first space weather forecasting centre this week.

Funding for the Met Office Space Weather Operations Centre, based at the organisation’s headquarters in Exeter, was announced by the government late last year.

A solar flare captured in July 2012.

NASA/Royal Observatory Belgium/SIDC

Since May the centre has been operating 24/7, ahead of its public launch on 8 October. As well as giving early warning of space weather threats to critical infrastructure, such as the National Grid, the Met Office now also provides publicly available forecasts, published on its website.

‘Space weather’ is a term that covers how radiation and high-energy particles, ejected from magnetic storms on the Sun, interact with Earth’s magnetic field and affect terrestrial technology. Severe space weather can knock out satellite communications and disrupt global positioning systems (GPS) and power grids.

The centre came about following three years of discussion between the Met Office and its US counterpart, the National Oceanic and Atmospheric Administration’s National Weather Service, which was keen to establish a backup for its Space Weather Prediction Center (SWPC) in Boulder, Colorado.

To determine how soon a solar event will be felt on Earth, forecasters at the SWPC and Met Office will use the same models, based on data from the same spacecraft. But by running the models at slightly different times, forecasters will be able to compare the results and generate a more accurate picture, says Catherine Burnett, space weather programme manager at the Met Office. The UK’s centre will also use different ground-based data to hone its forecasts for the UK, she adds.
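The value of staggered runs can be illustrated with a toy ‘time-lagged ensemble’: runs of the same model started at different times carry partly independent errors, so combining them tends to beat any single run. The Python sketch below is synthetic and purely illustrative of that statistical point, not of the centres’ actual models or data.

```python
import numpy as np

# Toy time-lagged ensemble: two runs of the "same model" with independent
# errors, combined by averaging. All values are synthetic.
rng = np.random.default_rng(42)
truth = 48.0       # hypothetical arrival time of a solar event, in hours
n_events = 10_000  # simulate many events to compare error statistics

run_a = truth + rng.normal(0, 6, n_events)  # e.g. an SWPC run
run_b = truth + rng.normal(0, 6, n_events)  # e.g. a Met Office run started later
combined = (run_a + run_b) / 2              # simple two-member ensemble mean

for name, est in [("single run", run_a), ("combined", combined)]:
    rmse = np.sqrt(np.mean((est - truth) ** 2))
    print(f"{name:>10s} RMSE: {rmse:.2f} h")
```

Averaging two runs with independent errors cuts the root-mean-square error by a factor of about √2, which is the statistical intuition behind comparing staggered forecasts.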

Speaking ahead of the official launch, Laura Furgione, deputy director at NOAA’s National Weather Service, said that accurately predicting and preparing for the impacts from space weather required “a commitment similar to terrestrial weather forecasting and preparedness”.