3 Ways Big Data, Supercomputing Change Weather Forecasting

"You can't control the weather." It's a phrase we often utter as we plan weddings, family picnics, and beach vacations. While it's true that we can't control the weather, scientists and technologists are working hard to improve our ability to predict it.


What that means for enterprise IT isn't just the ability to foresee disasters that could wipe out a data center. Granular and early forecasts can create business opportunities -- for example, alerting a brewery to supply distributors and points of sale with extra inventory in anticipation of an unusually warm spring weekend in the Northeast. Or suppose a blizzard is due to hit Atlanta on Black Friday -- with enough notice, retailers could adjust their plans.

Beyond the burgeoning data services industry, weather has massive economic and safety implications. Weather Analytics, a company that provides climate data, estimates that weather impacts more than 33% of worldwide GDP, affecting the agriculture, tourism, fishing, recreation, and airline industries, to name just a few.

Uncertain weather conditions also impact small business owners, such as the local painter who can't complete his job on time when foul weather persists. In addition, public safety is of vital concern when officials aim to understand the impact of extreme weather events such as hurricanes, tsunamis, or wildfires. Costs associated with extreme weather across the world totaled more than $125 billion in 2013, and the frequency of these events is on the rise.

Beyond the private sector, governments without specific data about a forthcoming event may waste money casting a net that's too wide. For example, every additional mile of evacuation associated with an impending storm can result in millions of dollars in lost revenue, wages, and relocation expenses. And hospitals and emergency facilities anticipating a severe storm that could knock out power could stock up on extra fuel for generators.

So while we're not able to control the weather, better forecasting will allow us to make more informed plans that can limit financial losses, provide new business opportunities, reduce government spending, and even save lives.

Unfortunately, improving our ability to predict the weather is challenging, both scientifically and computationally. Supercomputing has played a major role in enabling predictive models since the 1950s and remains the cornerstone of today's weather and climate modeling. Constantly improving computational capabilities have allowed scientists and forecasters to produce results faster than ever while also investigating increasingly complex phenomena and producing specialized forecast products. From model performance to system and data management, weather prediction presents unique high-performance computing challenges.

Supercomputing, along with big data, can meet the future demands of weather forecasting in three key areas:
1. Managing and utilizing enormous data sets: The volume and diversity of environmental data is increasing exponentially, placing great demand on the infrastructure to transport, manage, and store this data, and requiring ever-greater computational power for simulations that use it. This creates new opportunities for specialized services, developed with researchers in public and private institutions. One example is leveraging new sources of observation, such as sensors placed on automobiles. Imagine thousands of sensors in an urban area providing real-time meteorological information. Models are also evolving to analyze this tsunami of data and augment traditional physics-based simulations.

2. Increasing model resolution: Higher-resolution models are a critical element to better estimate the long-term state of climate systems and to improve weather forecasting, particularly for severe weather events. Recent simulations of Hurricane Sandy by researchers at the National Center for Atmospheric Research and the University of Illinois using the Blue Waters supercomputer have zeroed in on a 500-meter resolution -- the equivalent of a few city blocks.

3. Addressing technology hurdles: As weather modeling and analytics become more data-intensive and computationally demanding, researchers must watch for performance bottlenecks such as memory, I/O, and interconnect latencies and bandwidths. Weather simulation requires thousands of microprocessors to run in parallel, pushing hardware and software to their scalability limits. In addition, scalable operating systems, compilers, and application libraries play an essential role in achieving sustained performance. Ultimately, the underlying technology infrastructure must be tightly integrated to support simulation and analytics workflows.
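
Points 2 and 3 are tightly linked, because resolution drives compute cost. As a rough, illustrative sketch (the scaling assumptions below are standard rules of thumb, not figures from this article): halving a model's horizontal grid spacing quadruples the number of grid columns, and numerical stability (the CFL condition) also forces a proportionally shorter time step, so cost grows roughly with the cube of the refinement factor.

```python
# Back-of-envelope cost of refining a weather model's horizontal grid.
# Assumptions (rules of thumb, not from the article): cost scales with
# the square of the refinement factor in the horizontal, times one more
# factor for the shorter time step the CFL stability condition requires.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Relative compute cost of moving from coarse_km to fine_km spacing."""
    refinement = coarse_km / fine_km
    return refinement ** 2 * refinement  # more columns (x, y) * more time steps

# Refining a hypothetical 4 km operational grid to the 500 m (0.5 km)
# resolution of the Hurricane Sandy simulations: an 8x refinement, so
# roughly 8**3 = 512x the computation.
print(relative_cost(4.0, 0.5))  # -> 512.0
```

And that is before adding vertical levels or ensemble members, which multiply the cost further -- this cubic growth is what pushes high-resolution forecasting onto massively parallel systems in the first place.
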

Infrastructures that support routine execution of high-resolution forecasts alongside data-driven analytics, combined with advanced research, will enable a whole new array of specialized meteorological services for the public and private sectors. The future of weather forecasting requires capabilities we couldn't even conceive of when we began predicting the weather 64 years ago. Supercomputing innovation has so far kept pace with the demands of the community, and it is poised to offer new solutions in the years to come.
