davieG

Technology, Science and the Environment.

We were wrong — worst effects of climate change can be avoided, say scientists

https://www.thetimes.co.uk/edition/news/we-were-wrong-worst-effects-of-climate-change-can-be-avoided-say-scientists-k9p5hg5l0

 

The worst impacts of climate change can still be avoided, senior scientists have said after revising their previous predictions.

 

The world has warmed more slowly than had been forecast by computer models, which were “on the hot side” and overstated the impact of emissions, a new study has found. Its projections suggest that the world has a better chance than previously claimed of meeting the goal set by the Paris agreement on climate change to limit warming to 1.5C above pre-industrial levels.

 

The study, published in the journal Nature Geoscience, makes clear that rapid reductions in emissions will still be required but suggests that the world has more time to make the changes.

 

Michael Grubb, professor of international energy and climate change at University College London and one of the study’s authors, admitted that his past prediction had been wrong.

 

He stated during the climate summit in Paris in December 2015: “All the evidence from the past 15 years leads me to conclude that actually delivering 1.5C is simply incompatible with democracy.”

 

Professor Grubb told The Times yesterday: “When the facts change, I change my mind, as [John Maynard] Keynes said. It’s still likely to be very difficult to achieve these kind of changes quickly enough but we are in a better place than I thought.”

The latest study found that a group of computer models used by the Intergovernmental Panel on Climate Change had predicted a more rapid temperature increase than had taken place. Global average temperature has risen by about 0.9C since pre-industrial times but there was a slowdown in the rate of warming for 15 years before 2014.

 

Myles Allen, professor of geosystem science at the University of Oxford and another author, said: “We haven’t seen that rapid acceleration in warming after 2000 that we see in the models. We haven’t seen that in the observations.”

 

He added that the group of about a dozen computer models, produced by government institutes and universities around the world, had been assembled a decade ago “so it’s not that surprising that it’s starting to divert a little bit from observations”. Too many of the models used “were on the hot side”, meaning they forecast too much warming.

 

According to the models, keeping the average temperature increase below 1.5C would mean that the world could emit only about 70 billion tonnes of carbon after 2015. At the present rate of emissions, this “carbon budget” would be used up in three to five years. Under the new assessment, the world can emit another 240 billion tonnes and still have a reasonable chance of keeping the temperature increase below 1.5C.

“That’s about 20 years of emissions before temperatures are likely to cross 1.5C,” Professor Allen said. “It’s the difference between being not doable and being just doable.”
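
(A quick back-of-the-envelope check of that arithmetic, for anyone trying to square the figures. The article never states the annual emission rate, so the rate below is inferred from Professor Allen's "about 20 years" remark — an assumption, not a reported number.)

```python
# Rough sketch of the carbon-budget arithmetic quoted above.
# Budgets are in billions of tonnes (Gt) of carbon emitted after 2015.

new_budget_gtc = 240        # revised budget from the Nature Geoscience study
years_for_new_budget = 20   # "about 20 years of emissions" (Prof Allen)

# Inferred, not stated in the article: the emission rate this implies.
implied_rate_gtc_per_year = new_budget_gtc / years_for_new_budget  # ~12

old_budget_gtc = 70         # budget implied by the older, "hot" models

print(f"Implied emission rate: ~{implied_rate_gtc_per_year:.0f} GtC/year")
print(f"Old budget would last: ~{old_budget_gtc / implied_rate_gtc_per_year:.1f} years")
# ~5.8 years at that rate — in the rough ballpark of the article's
# "three to five years" for the old budget.
```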

 

Professor Grubb said that the fresh assessment was good news for island states in the Pacific, such as the Marshall Islands and Tuvalu, which could be inundated by rising seas if the average temperature rose by more than 1.5C.

Other factors pointed to more optimism on climate change, including China reducing its growth in emissions much faster than predicted and the cost of offshore windfarms falling steeply in Britain.

 

Professor Grubb called on governments to commit themselves to steeper cuts in emissions than they had pledged under the Paris agreement to keep warming below 1.5C. He added: “We’re in the midst of an energy revolution and it’s happening faster than we thought, which makes it much more credible for governments to tighten the offer they put on the table at Paris.”

 

The Met Office acknowledged yesterday a 15-year slowdown in the rise in average temperature but said that this pause had ended in 2014, the first of three record warm years. The slowing had been caused by the Pacific Decadal Oscillation, a pattern of warm and cool phases in Pacific sea-surface temperature, it said.

 

Analysis

When 194 nations met in Paris in 2015 and agreed to try to limit the increase in global average temperature to 1.5C, many scientists dismissed the goal as unattainable (Ben Webster writes).

 

They said it would be politically and economically impossible to cut emissions fast enough and that the world would have to prepare for worse droughts and heatwaves and islands disappearing beneath rising seas.

 

Now it turns out the scientists were being too pessimistic and had been led astray by computer models.

 

Other factors have also contributed to the new, more optimistic assessment, including the cost of renewable energy and China’s emissions growth both falling faster than almost anyone had predicted.

 

Computer models remain the best way to work out how quickly we need to cut emissions to avoid climate change, but scientists could be nimbler at revising them when actual readings diverge from predictions.


Interesting article - par for the course for the Times.

 

Personally, I would agree that keeping things below 1.5C is possible given this new information, and that a renewable energy revolution being behind that is a really good thing. I never believed the worst-case change was inevitable - it would only happen if humans pretended nothing was wrong and did nothing. Thankfully, it looks like we aren't doing that.


Some of these numbers are incomprehensible but interesting all the same.

 

The Race to Build a Computer Powerful Enough to Predict the Future

http://www.slate.com/articles/technology/future_tense/2017/09/exascale_computers_will_be_fast_enough_to_predict_the_future.html

 

In June, for the first time in two decades, the United States did not operate one of the top three most powerful computers in the world. Instead, China took the highest two slots, and Switzerland came in third, according to the Top500 list, a global ranking of the most powerful supercomputers on the planet.

 

The two fastest supercomputers from China clock in at 93 and 33 petaflops. A petaflop is a unit of computer performance equal to 1,000,000,000,000,000 (a quadrillion) calculations per second. But even China’s 93 petaflop machine is slower than the supercomputer that the U.S., Japan, and other competing nations want to build—because what these countries really want is to build the world’s first exascale computer. An exaflop is 1,000 petaflops, and there’s no computer that powerful in the world now. Not even close. For perspective, most consumer laptops operate at gigascale speeds, which is 1 billion calculations per second. An exascale computer is a billion times faster than that.
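
(To make those prefixes concrete, here's a minimal sketch of the scale involved, using the machine speeds quoted above and the article's own rough "gigascale" figure for a laptop:)

```python
# Orders of magnitude for the computer speeds discussed above.
GIGA, PETA, EXA = 1e9, 1e15, 1e18   # calculations per second

laptop = 1 * GIGA       # "gigascale" consumer laptop (the article's rough figure)
china_top = 93 * PETA   # fastest Chinese machine on the Top500 list
exascale = 1 * EXA      # the target no machine has yet reached

print(f"Exascale vs laptop: {exascale / laptop:,.0f}x")   # 1,000,000,000x
print(f"Exascale vs 93 PF:  {exascale / china_top:.1f}x") # ~10.8x
```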

 

So what’s the point of that? The more powerful the computer, the more realistic the models it can create. Supercomputers are already used to predict weather and earthquakes, but there’s not currently enough computing power to model complex physical and biological systems precisely enough to make endeavors like large-scale transitioning to wind energy, for example, feasible. An exascale computer would be powerful enough to uncover answers to questions about, say, climate change and growing food that can withstand drought. It could even predict crime (hopefully with more accuracy and fairness than current predictive policing systems).

 

Building an exascale computer is a national-level project. And earlier this summer, the U.S. Department of Energy shelled out $258 million to six different companies—Hewlett-Packard, Intel, Nvidia, Advanced Micro Devices, Cray, and IBM—all working on the components that would one day go into building such a system. “There is no single company that can afford to do this, and even a consortium of companies would not be able to do it,” said Thom Dunning, a chemistry professor at the University of Washington and the co-director of the Northwest Institute for Advanced Computing. That makes the exascale project a perfect example of why government funding of science is so important. The firms awarded the funds will cover at least 40 percent of the cost of the research themselves.

 

Japan and China both have initiatives to build exascale computer systems, too, and the nation that does it first will unlock all kinds of ways of predicting the future and understanding the present. It could put that country far ahead of the rest of the world in terms of scientific and technological achievement, which in turn translates to economic power.

 

Take the problem of transitioning to more wind energy. At the moment only about 5 percent of U.S. energy needs are met through wind power. That’s because wind farms aren’t always more cost effective than fossil fuels when factoring out subsidies, even if wind energy is ultimately better for the environment. And that’s why, according to a recent paper by the National Renewable Energy Laboratory, the DOE has dubbed the effort to improve the efficiency of wind power plants a national “grand challenge” that requires “the world’s largest computers and advanced computationally efficient algorithms to resolve.” In other words, with better computing, researchers will be able to accurately model how wind flows through a plant. That know-how will filter directly into better industrial designs and cost reduction of sustainable energy systems. As part of the Department of Energy’s larger exascale project, the NREL is working to build predictive wind energy models that can work on an exascale-level machine by 2022.

 

Stronger computers mean a better understanding of how we engineer a more sustainable future. The same goes for creating drought-resistant plants or biofuels in the future, which is what Dunning’s research team is working on. Understanding why plants stress out in droughts, says Dunning, has to do with the way ions travel across a cell membrane. And to model that process with the kind of detail needed to make accurate predictions, “you’ve got to worry about the membrane, the ions, and everything that’s inside the cell.” That takes a lot of computer power. “But if you can understand that process better, you may be able to engineer crops that are used for biofuels, as opposed to food, or you could engineer them to better respond to droughts,” Dunning explained.

 

Exascale computing power would also allow the federal agencies tasked with making sense of surveillance data, like the National Security Agency and the FBI, to actually analyze the massive amount of information they sweep up in their dragnet global digital surveillance operations. That data is currently stored at a network of data centers across the country, but it’s not necessarily being analyzed at a rate fast enough to thwart attacks. With an exascale system, paired with exascale-level software, law enforcement could scan social media in real time. That, coupled with other data sources, could ostensibly more accurately predict when someone is about to commit a crime and try to stop them. While targeted surveillance and predictive policing might be among the more disturbing uses of these systems, national security is one of the key motivations behind exascale research. And like all technologies, it can be used in positive ways, like promoting sustainable energy, and potentially nefarious ways, like deepening state surveillance.

 

The DOE hopes to get the first exascale computer working in the field by 2021. But in order for that computer to work in a meaningful way, it will need to run software that can handle the processing in an ethical and efficient way, too. And considering how biased and at times racist software can be in today’s systems, which are relatively prehistoric in capacity, it’s not going to be easy to build. Still, one of the hallmarks of the DOE’s work is a strong focus on software and computing power, which sets the U.S. apart from China, which, according to Dunning, is overly focused on hardware.

 

Currently, the fastest computer in the U.S., Titan, runs at about 18 petaflops, but by next year, the government is expecting to unveil a new supercomputer that, at peak performance, will run at about 200 petaflops. That computer, Summit, is located at the Oak Ridge National Laboratory in Tennessee. That’s a huge jump in capacity, but China is pursuing its exascale computing project aggressively too. And if the U.S. does hope to be the first to capture that level of computing power—a level so high that it can model the present and predict the future—then the $258 million the DOE invested this summer was the right move. Perhaps it’s the nationalist nature of the effort that inspires President Trump to continue government funding of this scientific research, or its ties to national security. But whatever the reason is, it’s exactly the type of scientific research that depends on government-level funding. And the future of our future depends on getting this right.
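
(As a sanity check on that "huge jump", the speedups implied by the quoted figures — assuming, as the article does, that exascale means 1,000 petaflops:)

```python
# Speedups implied by the figures in the paragraph above (petaflops).
titan_pf, summit_pf, exascale_pf = 18, 200, 1000

print(f"Summit vs Titan:    ~{summit_pf / titan_pf:.0f}x")    # ~11x
print(f"Exascale vs Summit: ~{exascale_pf / summit_pf:.0f}x") # 5x still to go
```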


Interesting paper on attitudes towards science from both sides of the political spectrum.

 

http://journals.sagepub.com/doi/abs/10.1177/1948550617731500?journalCode=sppa&

 

I will, however, add my own twopennyworth as an addendum and say that IMO science denial on the right is a bigger problem than science denial on the left right now - not because conservatives engage in it more (the study is pretty conclusive on the evenness of such denial), but because the particular issues where conservative views clash with the science are (often) much more important issues for the future as a whole. Specifically, climate change denial is much more dangerous than any other mainstream politically motivated science denial, since climate change is a possible - if not probable - existential threat.


1 hour ago, leicsmac said:

Interesting paper on attitudes towards science from both sides of the political spectrum. […]

You use the term denial like it's a proven fact and not just a scientific theory. Scepticism is a better phrase.


59 minutes ago, Webbo said:

You use the term denial like it's a proven fact and not just a scientific theory. Scepticism is a better phrase.

Increased concentrations of CO2 and gradual increase in global temperature since 1800 are facts not in question. What changes this may bring is very much open for debate however.

 

Though, of course, I'm thinking the "deniers" don't deny that things are changing in that way - they are merely skeptical of both human involvement and the degree of changes such will bring.

 

Perhaps you could use the term "apathy towards the future" rather than "denial" in that case... because I believe that to be true, and that such apathy is every bit as dangerous in this regard as flat-out denial.


Couple of interesting tidbits for today:

 

http://www.bbc.com/news/science-environment-41279470

 

Basically, we kill the big things because we covet them and the small ones because we ignore them.

 

And...

 

https://futurism.com/china-claims-they-have-actually-created-an-em-drive/

 

Though it's dubious as it's been claimed before, could be exciting stuff from China.


  • 1 month later...

It's the Indy so the tone is rather too "sky-is-falling-everyone-panic" for my liking, but the statistics are pretty sobering. Whether humans are causing climate change or not, the other things we're doing are certainly having a negative effect.

 

http://www.independent.co.uk/environment/letter-to-humanity-warning-climate-change-global-warming-scientists-union-concerned-a8052481.html


Interesting article from the Grauniad on the advancement of AI and how it might be applied for autonomous weaponry:

 

https://www.theguardian.com/technology/2017/nov/15/im-a-pacifist-so-why-dont-i-support-the-campaign-to-stop-killer-robots

 

A friend said something regarding this that pretty much represents my view on the matter: "A while back, I listened to a presentation about the ability to control technology. From a review of various historical attempts to ban technologies, the speaker's conclusions were that technology bans work when the infrastructure for the technology is large and visible; when nobody really wants the technology in the first place (e.g. reproductive cloning); and when there's a single point of decision/enforcement. Bans fail when the technology is diffusible, when there are strong incentives to get it, and when there are multiple stakeholders involved.

Autonomous weapons fall into the "bans fail when" category in every regard."

 

Is this a future threat to be considered? If so, how can it be contained, if indeed it can?


30 minutes ago, leicsmac said:

Interesting article from the Grauniad on the advancement of AI and how it might be applied for autonomous weaponry […]

face recognition goes down the pan if you wear a burka.


  • 4 weeks later...
On 20/09/2017 at 15:45, Webbo said:

You use the term denial like it's a proven fact and not just a scientific theory. Scepticism is a better phrase.

Over 97% of climate scientists are in agreement that man is causing environmental changes that are warming the planet and threatening our survival. The other <3% (and bear in mind some climate science is paid for by the fossil fuel industry) are for some reason given platforms to make the consensus look less clear. No non climate scientist has any reasonable grounds for being sceptical.


  • 1 month later...

http://www.bbc.com/news/science-environment-42736397

 

How much longer can folks really be skeptical?

 

That being said, the pertinent part is right at the end - “Rather than warming being inconsequential or catastrophic, as some have suggested, we can be sure societies are facing a dangerous temperature rise, but one which we still have time to fix. The conclusions confirm that human-caused climate change is a serious concern. But if we act now with sustained and substantial cuts in greenhouse gas emissions, societies will still be able to avoid much of the most dangerous climate change predicted by computer simulations.”

 

We can still mitigate the effects of what's to come - if the will is there.


34 minutes ago, leicsmac said:

http://www.bbc.com/news/science-environment-42736397 […]

It’s cold here though.

penguins.


11 minutes ago, Buce said:

 

So, the plan is to take CO2 from the air and pollute the world with even more non-biodegradable plastic?

 

Brilliant.

 

 

:D Well I think the idea is to make plastic - which people want anyway - from earth-threatening CO2 using renewable energy, and therefore to have a net positive effect, but of course you could be correct.


This thread is way above my head but I heard yesterday that in the future, recycled plastic could be 'ground down' and mixed with bitumen etc. to be used for resurfacing roads. Apparently plastic contains materials that would prolong road surfaces, reduce potholes and generally help the longevity of our roads. Not sure how much truth there is in it, but it sounded like a smart idea?


27 minutes ago, Izzy Muzzett said:

This thread is way above my head but I heard yesterday that in the future, recycled plastic could be 'ground down' and mixed with bitumen etc. to be used for resurfacing roads. […]

Structurally it can't be any worse than the rubbish they're laying down at the moment - it only seems to take a couple of weeks for a freshly resurfaced road to wear away into a hazardous obstacle course, especially where you have manholes in the middle of major thoroughfares.

