Themes


  • Improve how Numerical Weather Prediction models work (numerics, dynamics, physics)
  • Use mountain-weather observations to improve the forecast initial conditions (data assimilation)
  • Forecast the spread of air pollutants for air quality and climate (atmospheric dispersion)
  • Forecast avalanches, flooding, and stream flow into reservoirs (hydrometeorology)
  • Predict forecast uncertainty and the range of possible outcomes (probabilistic forecasting)
  • Predict weather and turbulence in the lower atmosphere where people live (boundary layer meteorology)
  • Improve the understanding of atmospheric temperature, humidity, and clouds (thermodynamics)
  • Focus on the complex terrain of western Canada (mountain & coastal mesoscale meteorology)
  • Increase efficiency of clean electric generation, transmission and use (wind & hydro power, load forecasts)

Wildfire smoke research and prediction

Visible satellite image from 2020-09-12

For smoke forecast models (such as BlueSky Canada) to provide useful information in a timely manner, they must generalize, or parameterize, complex wildfire-atmosphere interactions. The development of such parameterizations requires a deep understanding of these numerous physical processes. Our group utilizes large eddy simulation (LES) models, augmented with field observations at controlled burns, to investigate these processes.

The approach is synergistic: LES simulations influence our decisions on how to observe a controlled burn, while the data collected and analyzed from controlled burns refines our wildfire and smoke modelling. So far, this work has led to a new smoke plume rise parameterization scheme that we will integrate into our BlueSky Canada model, and the development of a new low-cost smoke sensor that we plan to deploy at future controlled burns.

by N. Moisseeva, C. Rodell, R. Howard, and R. Stull

Wildfire smoke is a complex and dynamic pollutant. As wildfires become more frequent and intense under the changing global climate, smoke pollution is quickly emerging as one of the key issues facing air quality in the coming decades. Our ability to predict where and how smoke travels is crucial to mitigating its negative impacts for human health and the environment.

Intense heat above the fire creates updrafts, which simultaneously mix with and modify the ambient environment. These turbulent columns of hot air mixed with fire emissions are referred to as plumes. The term plume rise typically describes the initial buoyant phase of a smoke plume, which determines how high in the atmosphere the pollutants will travel. Due to vertical wind shear, small errors in plume rise predictions can have profound consequences for downwind dispersion and forecast smoke concentrations at the earth’s surface. This is why plume rise is often the pivotal point in the smoke modelling process.
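A toy calculation illustrates why plume-rise errors matter under vertical wind shear. The wind layers, speeds, and directions below are entirely made-up illustrative values, not from any study: the point is only that a small error in injection height near a layer boundary puts smoke into a different wind regime, sending it in a different direction.

```python
import math

# Hypothetical layered wind profile: (layer top in m, direction in deg, speed in m/s).
# Values are illustrative only.
WIND_LAYERS = [(1000.0, 270.0, 5.0), (2000.0, 310.0, 10.0), (3000.0, 350.0, 15.0)]

def wind_at(height_m):
    """Return (direction_deg, speed_ms) for the layer containing height_m."""
    for top, direction, speed in WIND_LAYERS:
        if height_m <= top:
            return direction, speed
    return WIND_LAYERS[-1][1], WIND_LAYERS[-1][2]

def downwind_displacement(injection_height_m, hours):
    """Straight-line smoke displacement (km east, km north) after `hours`,
    assuming the plume stays in its injection layer."""
    direction, speed = wind_at(injection_height_m)
    # Meteorological convention: direction is where the wind blows FROM.
    to_rad = math.radians((direction + 180.0) % 360.0)
    dist_km = speed * 3600.0 * hours / 1000.0
    return dist_km * math.sin(to_rad), dist_km * math.cos(to_rad)

# A 200 m plume-rise error near a layer boundary changes the wind regime,
# and hence where the smoke ends up six hours later.
low = downwind_displacement(1900.0, hours=6)
high = downwind_displacement(2100.0, hours=6)
```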

We use Large-Eddy Simulations (LES) to gain insights into how fire and atmospheric conditions influence smoke plume behaviour. We analyze various dynamical feedbacks that exist between the fire and the ambient air and develop new analytical methods for predicting the vertical distribution of wildfire emissions in the atmosphere. The broad goal of our work is to improve the accuracy of our smoke forecasts.

Weather forecasting on the Google Cloud

Photo credit: J. Jeworrek

Leveraging cloud-computing resources for affordable, real-time weather forecasting

by T. Chui, D. Siuta, H. Modzelewski, G. West, R. Schigas, and R. Stull

Numerical weather prediction (NWP) models are computationally intensive pieces of software, requiring substantial computing resources (i.e., compute cores, RAM) to run. Conventionally, NWP simulations for research or real-time weather forecasting have been run on supercomputers, formally known as on-premise high-performance computing (HPC) resources. These on-premise resources are very expensive to acquire and maintain, often beyond the reach of researchers and forecasters without extensive grants or capital. Upgrades to these HPC systems, whether software licensing or additional hardware, incur further costs. Most users are thus forced to share these resources with many others through queues, limiting on-demand flexibility.

Cloud-computing resources, like those hosted by the Google Cloud Platform (GCP) or Amazon Web Services (AWS), provide an alternate avenue for researchers and forecasters requiring computer power. Users pay to use compute resources on these cloud systems as needed, with all hardware upgrades (e.g. processor architecture) and physical maintenance handled entirely by the companies themselves. This provides great flexibility to users, in that they do not need to worry about sharing queues with other users or being restricted by the hardware of their single on-premise system. The flexible payment model means that users can better plan their budget depending on their applications, without the upfront capital purchase of an on-premise cluster.
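The budgeting trade-off can be sketched with a toy break-even calculation. All dollar figures below are made-up assumptions for illustration, not actual GCP/AWS pricing or real cluster costs:

```python
# Illustrative break-even sketch (all figures are assumptions, not real prices).
def onprem_cost(years, capital=200_000.0, annual_maintenance=20_000.0):
    """Total cost of owning a small HPC cluster: upfront capital plus upkeep."""
    return capital + annual_maintenance * years

def cloud_cost(years, hourly_rate=8.0, hours_per_day=6.0):
    """Pay-per-use cost, e.g. one multi-node forecast run of a few hours per day."""
    return hourly_rate * hours_per_day * 365.0 * years

# With these assumptions, the cloud stays cheaper for years because you
# only pay while the forecast is actually running.
five_year_onprem = onprem_cost(5)   # capital + 5 years of maintenance
five_year_cloud = cloud_cost(5)
```

The numbers are invented, but the structure of the comparison (large upfront capital versus pay-per-use) is what makes cloud resources attractive for intermittent, real-time forecast workloads.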

Tim has done research in leveraging the GCP-hosted cloud system to run WRF and MPAS simulations for research and real-time forecasting. Through various cost and performance optimizations, the GCP is now a reliable and affordable computing system for our research team. In addition to supporting Tim’s research with MPAS modelling and mesh prototyping, the GCP also contributes to the team’s real-time Short-Range Ensemble Forecast (SREF) and provides high-resolution, North America-wide weather forecasts for the BlueSky smoke forecasting system.

Dynamical Downscaling of Climate Models

Photo credit: C. Rodell

by L. Buchart, E. Gnegy, T. Chui, R. Howard, and R. Stull

Alongside operational and research-based weather forecasts, our group is exploring dynamical downscaling of climate models over western Canada. This is done by incorporating global climate models (GCMs) into the WRF model and leveraging its nesting capabilities. GCMs are typically coarse; nesting brings these global models to regional scales. Downscaling to kilometre scales allows the study of weather phenomena related to local effects (topography, mountain flows, etc.). Such modelling can support decision making regarding decadal-scale fire weather, precipitation patterns, and coastal climates.

Our group is currently working on two dynamical-downscaling projects: one focused on fire weather in the Pacific Northwest and another focused on weather in the Salish Sea region. Fire weather is poorly resolved by GCMs because temperature is coupled to elevation and moisture to topographic characteristics. Our group has partnered with PCIC (Pacific Climate Impacts Consortium, University of Victoria) to model future fire weather at a ‘scale-free’ resolution across the Pacific Northwest. The goal of this work is to gauge potential changes in the areal extent and overall risk of wildland fires. Additionally, our group is working on an ongoing project whose output will drive ocean and wave models from the Department of Fisheries and Oceans. This will support the management of local ecosystems and fisheries throughout the Salish Sea.

Modelling on variable-resolution meshes

Photo credit: J. Jeworrek

Improving weather forecasts using variable-resolution Voronoi meshes

by T. Chui, H. Modzelewski, R. Howard, R. Schigas, P. Austin and R. Stull

MPAS temperature forecast with customized mesh.

All weather models use a grid, or mesh, upon which the forecasting equations are solved numerically. One type of mesh, when applied to the sphere for global weather forecasting, is called the Spherical Centroidal Voronoi Tessellation (SCVT). The Model for Prediction Across Scales (MPAS) is a global model that uses the SCVT to forecast the weather and climate.

The advantage of using an SCVT is the ability to define a variable-resolution mesh, such that the grid-spacing of the model can be transitioned seamlessly from synoptic to convective scale, allowing a modeler to focus on particular areas of interest at high resolution. This smooth refinement also removes the need to nest finer grids within coarser ones (like in the Weather Research and Forecasting [WRF] model), thus bypassing numerical noise due to interpolations from nesting.
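The smooth refinement is usually controlled by a mesh density function. The sketch below is a hypothetical density function with illustrative radii and refinement ratio, not the team's actual mesh parameters; it assumes the common SCVT property that local cell width scales roughly as density to the -1/4 power:

```python
def mesh_density(dist_km, r_inner=500.0, r_outer=1500.0, ratio=16.0):
    """Smooth SCVT density function: high density (fine cells) within r_inner km
    of the region of interest, low density beyond r_outer, and a smooth
    transition between. `ratio` is the coarse-to-fine cell-width ratio.
    All parameter values are illustrative assumptions."""
    # For an SCVT, cell width scales roughly as density**(-1/4),
    # so a width ratio of 16 requires a density ratio of 16**4.
    rho_min = ratio ** -4
    if dist_km <= r_inner:
        return 1.0
    if dist_km >= r_outer:
        return rho_min
    t = (dist_km - r_inner) / (r_outer - r_inner)
    # Smoothstep blend avoids abrupt resolution jumps (no nesting needed).
    s = t * t * (3.0 - 2.0 * t)
    return (1.0 - s) * 1.0 + s * rho_min

def relative_cell_width(dist_km):
    """Cell width relative to the finest cells."""
    return mesh_density(dist_km) ** -0.25
```

With such a function, cells transition seamlessly from roughly convective-scale spacing near the region of interest to synoptic-scale spacing far away, which is exactly the refinement behaviour described above.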

However, such SCVTs take a long time to generate, especially ones with many mesh cells. Additionally, forecast quality and numerical stability are dependent on the shapes of the generated mesh cells (i.e. they should primarily be hexagonal, with acute interior angles). Conventional generation methods do not guarantee mesh quality.

Tim’s research focuses on ways to improve forecast quality on such SCVT meshes, by experimenting with unconventional methods to generate these meshes quickly, while making sure that cell quality is “good enough” for forecasting. His aim is to improve MPAS’s forecasting skill for mesoscale simulations, especially over the complex terrain of western Canada.

Artificial neural networks for short term electric load forecasting

Photo credit: G. Warne

by E. Wicksteed, G. West, and R. Stull

Structure of the multi-layer perceptron artificial neural network with 1 hidden layer. The layers are connected by activation functions. Neurons in the hidden layer are labelled 1 to n, where n is the total number of hidden neurons. Bias neurons are added to the input and hidden layers.

Short term load forecasting is the prediction of electricity use for hour- to week-ahead time horizons. Electric utility companies use short term load forecasts (STLFs) in their daily operations to match generation with anticipated load. Inaccurate forecasts can be expensive for companies if demand is higher or lower than expected. Load forecasting is challenging because electricity demand is dependent on human behaviour and weather, with temperature being the weather variable most commonly used as input to STLF models.

Our research combined the use of numerical weather prediction (NWP) model output and machine learning methods, with the aim to improve on the current load forecasting model used in operation by BC Hydro. Their current model uses Vancouver temperature as the only weather input variable to forecast load across the province of British Columbia (BC), Canada. We attempted to account for weather patterns across BC by exploring the use of gridded NWP output in a multi-layer perceptron (MLP) artificial neural network (ANN).

We used model output from the GEFS reforecast model and used gridded temperature across BC as input to an ANN, along with other important predictor variables (day of the week, hour of the day, month, and previous load values). We compared this model to an ANN using only Vancouver temperature (instead of temperature across the province), which is similar to the model used in operation. The model using Vancouver temperature worked much better than the one using gridded temperature, likely because of the large number of input variables in the ANN when using the gridded variables. Because of these results, we explored several other ways of using NWP output to create a STLF model. The experiments differed in their input variables or in the number of hidden layers in the MLP. The experiments (including the two already mentioned) used the following input weather variables or MLP structures:

  1. point temperature for Vancouver, mimicking BC Hydro’s operational model;
  2. gridded temperature for BC;
  3. gridded temperature, humidity, precipitation, precipitable water, snow depth, and wind speed for BC;
  4. point temperature for five major BC load centres: Vancouver, Victoria, Abbotsford, Kelowna, Prince George;
  5. as in experiment 1, but with a two hidden layer MLP, rather than one;
  6. as in experiment 2, but with a two hidden layer MLP; and
  7. an ensemble method using weather model ensemble member temperature point forecasts for Vancouver.
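The one-hidden-layer MLP structure described above (input and hidden layers each augmented with a bias neuron) can be sketched in pure Python. The feature names, activation choice (tanh), and weight values below are illustrative assumptions, not the actual trained model:

```python
import math
import random

# Hypothetical feature encoding, following the experiment descriptions above.
FEATURES = ["vancouver_temp", "day_of_week", "hour_of_day", "month", "prev_load"]

def mlp_forecast(x, w_hidden, w_out):
    """Forward pass: inputs + bias -> hidden layer (tanh) + bias -> load estimate.
    w_hidden: one row of len(x)+1 weights per hidden neuron;
    w_out: n_hidden+1 output weights (last one for the hidden bias neuron)."""
    xb = x + [1.0]                                    # bias neuron on input layer
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, xb)))
              for row in w_hidden]
    hb = hidden + [1.0]                               # bias neuron on hidden layer
    return sum(wi * hi for wi, hi in zip(w_out, hb))  # linear output: load

random.seed(0)
n_hidden = 4
w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(len(FEATURES) + 1)]
            for _ in range(n_hidden)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]
example = [12.0, 2.0, 17.0, 11.0, 7500.0]  # made-up weather/calendar/load inputs
forecast = mlp_forecast(example, w_hidden, w_out)
```

Experiments 5 and 6 simply add a second hidden (plus bias) layer between the first hidden layer and the output.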

Of note are the ensemble model (7), and the two-hidden-layer Vancouver model (5), which performed the best for both hour ahead and 7-day load forecasts, with the ensemble model performing best overall. In both cases where two hidden layers were used in the MLP rather than one, model performance improved. From all the results, it was clear that using gridded temperature inputs did not improve model performance, and the current method of using only Vancouver temperature input works well to forecast load.

Despite the current BC Hydro method working fairly well, the implication of this research is that load forecasts for BC could be improved by using an ensemble load forecasting model, or an MLP with two hidden layers rather than one. An ensemble method in particular would be the most useful, as it could also be used for probabilistic load forecasting, which may provide information on model uncertainty and help load forecasters gauge their confidence in the STLF model.

Statistical and machine learning methods for precipitation forecast

Photo credit: R. Steinhart

Precipitation forecast enhancement with a hybrid of analog ensemble, Schaake shuffle and convolutional neural networks

by Yingkai Sha, D.J. Gagne, G. West, and R. Stull

Extreme rainfall and flood events are threats to public safety and place great stress on hydroelectric dams. British Columbia has a complicated geographical environment; skillful precipitation forecasts across its islands, mountains, and valleys are difficult to create. Facing this challenge, a set of highly localized precipitation forecast products is being produced to support BC Hydro in maintaining and planning its facilities.

Our production pipeline begins with the public forecasts released by the US National Centers for Environmental Prediction (NCEP) and incorporates state-of-the-art statistical and machine learning methods: analog ensemble, Schaake shuffle, and convolutional neural networks. Our product covers the Campbell River (Vancouver Island), Upper Columbia River (southern interior BC), and Peace River (northeast BC), specifically targets extreme events, and has high resolution in space and time to give our stakeholders quick answers.

Many parts of this work are still at the research stage; however, we are actively collaborating with BC Hydro to bring them into its operations and maintenance in the near future.

Analog ensembles for sub-daily short-range precipitation predictions

Photo credit: J. Jeworrek

by J. Jeworrek, G. West, and R. Stull

Local weather patterns tend to reoccur in similar (though not identical) ways. This concept can be used to make predictions based upon the collected knowledge of past forecasts and their matching observations. After investigating precipitation predictability in our region of interest with various model configurations, only the best performing models are used to generate an analog ensemble. From an archive of past model runs with these configurations, a set of “analog” past model forecasts is selected depending on their similarity to the target model forecast in key predictor variables. The matching past verifying observations compose the analog ensemble forecast for the target time. This technique is sensitive to several tuning parameters, such as the choice of predictors, similarity measure, and ensemble size. Our research seeks a methodology for locally optimized precipitation analog forecasts at over 50 station locations in southwest BC.
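The selection step can be sketched in a few lines. The predictor names, weights, and archive values below are illustrative assumptions; a real implementation would use tuned predictor weights and a much larger archive:

```python
# Minimal analog-ensemble sketch (predictors, weights, and data are illustrative).
def analog_ensemble(target, archive, n_analogs=5, weights=None):
    """Pick the past forecasts most similar to `target` (weighted Euclidean
    distance over predictor variables) and return their verifying observations
    as the ensemble. `archive` entries: (forecast_dict, observed_precip)."""
    weights = weights or {"precip": 1.0, "temp": 0.5, "wind": 0.5}

    def distance(fcst):
        return sum(w * (fcst[k] - target[k]) ** 2
                   for k, w in weights.items()) ** 0.5

    ranked = sorted(archive, key=lambda entry: distance(entry[0]))
    return [obs for _fcst, obs in ranked[:n_analogs]]

archive = [
    ({"precip": 4.0, "temp": 6.0, "wind": 3.0}, 3.2),
    ({"precip": 12.0, "temp": 4.0, "wind": 8.0}, 15.0),
    ({"precip": 5.0, "temp": 7.0, "wind": 4.0}, 6.1),
    ({"precip": 0.5, "temp": 10.0, "wind": 2.0}, 0.0),
]
target = {"precip": 4.5, "temp": 6.5, "wind": 3.5}
ensemble = analog_ensemble(target, archive, n_analogs=2)  # observations of the 2 closest analogs
```

The tuning parameters mentioned above map directly onto this sketch: the predictor set and `weights` define the similarity measure, and `n_analogs` is the ensemble size.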

Improving model physics

Photo credit: J. Jeworrek

A bulk microphysics scheme informed by ground-based dual-polarization radar observations

by A. Di Stefano, P. Austin, G. West, and R. Stull

In numerical weather prediction (NWP) models, microphysics schemes control the growth and evolution of clouds and precipitation, which occur at scales too small for models to resolve. Despite their wide application, many microphysics schemes were developed using field observations from individual case studies. As such, they are often regionally biased, subject to observation errors, and unable to represent certain microphysical processes accurately. Verification of microphysics schemes against a reliable observation set would unveil the model microphysical processes most affected by these errors. Improvements can then be made to increase the overall accuracy of microphysics schemes, with the hope that these would better simulate clouds and precipitation.

Dual-polarization radar has become an important source of high-quality microphysical observations. By taking advantage of the electromagnetic properties of hydrometeors (water particles such as raindrops and ice crystals), dual-polarization radar can capture the distributions of many hydrometeors in the atmosphere at a given time. This information can be compared with high-resolution NWP model output to determine how well microphysics schemes are simulating these hydrometeors.
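As a simple illustration of how modelled hydrometeor distributions map onto a radar quantity, the sketch below computes the reflectivity factor from a binned drop-size distribution using the standard Rayleigh approximation (Z as the sixth-moment sum of the distribution). The bin values are made up for illustration and are not from any model run or radar:

```python
import math

def reflectivity_dbz(diameters_mm, counts_per_m3_mm, bin_width_mm):
    """Reflectivity factor Z = sum(N(D) * D**6 * dD) in mm^6/m^3 from a
    binned drop-size distribution N(D), returned in dBZ (10*log10(Z))."""
    z = sum(n * (d ** 6) * bin_width_mm
            for d, n in zip(diameters_mm, counts_per_m3_mm))
    return 10.0 * math.log10(z)

# Light-rain-like spectrum: many small drops, few large ones (illustrative).
diameters = [0.5, 1.0, 1.5, 2.0, 2.5]      # drop diameter bin centres (mm)
counts = [1000.0, 400.0, 100.0, 20.0, 2.0]  # N(D) per m^3 per mm
dbz = reflectivity_dbz(diameters, counts, bin_width_mm=0.5)
```

Because the sixth power heavily weights large drops, the few 2-2.5 mm drops contribute more to Z than the thousand small ones, which is why radar variables are so sensitive to the shape of the simulated hydrometeor distribution.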

At present, we are exploring significant precipitation events in the Pacific Northwest. We are running NWP case studies in Washington state, USA, and pulling dual-polarization radar data in the same region for comparison. To bridge the weather and radar datasets, we are employing a NASA-developed software tool called POLARRIS, which converts high-resolution NWP output into synthetic radar data so that it can be verified against observations.

Improving wind speed forecasts

Photo credit: Dan Toulgoet (The Canadian Press)

by B. Jansens, G. West, and R. Stull

Strong winds are a significant weather hazard, and accurate predictions of the time, location, intensity, and direction of winds are of crucial importance for public safety. Windstorms can cause extensive damage to property and power infrastructure, and anticipating these dangers ahead of time can enable public services and utility companies to respond more quickly and efficiently. However, the wind forecasts produced by modern numerical weather prediction (NWP) models often struggle to make accurate and precise predictions of wind speed. This is particularly true in complex terrain, where the topography can strongly influence surface wind speeds in ways that are often not captured by NWP models.

This research focuses on creating improved wind forecasts for strong wind events in British Columbia (BC). The basic approach is downscaling: using information contained in better-predicted, lower-resolution data to derive information on finer spatial scales. However, neither of the two primary downscaling methods (dynamical and statistical) has proven effective at this problem on its own. Therefore, in this research we are using a hybrid dynamical-statistical downscaling scheme to improve wind speed forecasts for strong wind events in BC. This combines methods to optimize the dynamical model (e.g., choosing the best parameterization schemes in the NWP model and improving terrain representation) with more sophisticated machine learning techniques that aim to improve upon the statistical downscaling work in the literature.
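The statistical side of such a hybrid scheme can be illustrated with its simplest possible form: a per-station linear correction of NWP wind speed fit by least squares. The numbers are toy values, and the actual work uses more sophisticated machine learning methods; this sketch only shows the downscaling idea of mapping model winds onto station winds:

```python
# Minimal statistical-downscaling sketch: per-station linear correction of
# NWP wind speed, fit by ordinary least squares. Toy numbers throughout.
def fit_linear(nwp, obs):
    """Least-squares fit of obs ~ a * nwp + b for one station."""
    n = len(nwp)
    mean_x = sum(nwp) / n
    mean_y = sum(obs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(nwp, obs))
    var = sum((x - mean_x) ** 2 for x in nwp)
    a = cov / var
    return a, mean_y - a * mean_x

# Hypothetical station where the NWP model underforecasts strong winds.
nwp_winds = [5.0, 10.0, 15.0, 20.0]   # model wind speeds (m/s)
obs_winds = [6.0, 13.0, 20.0, 27.0]   # observed station winds (m/s)
a, b = fit_linear(nwp_winds, obs_winds)
corrected = a * 25.0 + b  # downscaled forecast for a raw 25 m/s NWP wind
```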

Improving the weather forecast initial conditions

Photo credit: J. Jeworrek

WRF initialized by the NAEFS ensemble mean

by R. Steinhart, G. West, and R. Stull

This research approaches numerical weather prediction (NWP) optimization from the angle of initial and boundary condition sensitivity. Specifically, we are investigating whether an ensemble average of initial conditions (ICs), from the North American Ensemble Forecast System (NAEFS), can be used to initialize the Weather Research and Forecasting (WRF) model. The ensemble-average forecast is often the best when compared with deterministic output; however, the resulting combination of wind, temperature, and pressure is sometimes unphysical (i.e., the fields do not agree with each other). For this reason, it is often assumed that ensemble-average forecasts should not be used as ICs. We are challenging this assumption in our research and testing whether it actually holds.

Streamflow & run-of-river forecasts for hydroelectric generation

John Hart Dam. Photo credit: BC Hydro

Utility of weather forecasts for run-of-river hydroelectric systems

by A. Kadel, D. McCollor, W. Antweiler, and R. Stull

Hydropower systems generate electricity using the kinetic energy of water flowing through turbines connected to generators. They are broadly divided into reservoir-based (storage) and run-of-river systems, which differ mainly in how water is diverted into the turbines. Storage systems use a large dam as a barrier on a river, creating a large lake from which the water is diverted. In contrast, run-of-river systems use a smaller barricade called a weir, which acts as a diversion only, without creating a large pond. Hence, run-of-river generation is intermittent, with power output depending on stream inflow.
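Why run-of-river output tracks inflow can be made concrete with the standard hydropower relation P = η·ρ·g·Q·H: with the head H fixed by the weir, power scales directly with streamflow Q. The plant figures below are made-up illustrative values:

```python
# Hydropower from flow: P = efficiency * rho * g * Q * H.
# With a fixed head, run-of-river output rises and falls with inflow Q,
# while a storage plant can hold water back and choose when to release it.
def hydro_power_mw(flow_m3s, head_m, efficiency=0.9):
    """Electrical power (MW) from flow Q (m^3/s) over a gross head H (m)."""
    rho, g = 1000.0, 9.81  # water density (kg/m^3), gravity (m/s^2)
    return efficiency * rho * g * flow_m3s * head_m / 1e6

# Toy example: the same hypothetical plant at low vs high streamflow.
dry_season = hydro_power_mw(flow_m3s=10.0, head_m=100.0)
wet_season = hydro_power_mw(flow_m3s=40.0, head_m=100.0)
```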

Streamflow prediction using weather forecasts is widely used in the operation of storage hydro systems. Since traditional run-of-river projects have no flow-control capability other than the weir, future inflow information may seem redundant; accordingly, this topic has been studied inadequately. Our research identifies the water-related operating constraints that a run-of-river hydroelectric operator could experience and explores how streamflow forecasts could be used in such scenarios. Examples where streamflow forecasts could add value include:

  • Assessing potential flood damage during the construction phase of a project
  • Forecasting future energy yield for wholesale markets that rely on day-ahead bidding

The study area is in Nepal, where more than 90% of domestic electricity production is through run-of-river hydro systems.

Verification to measure the quality of weather forecasts and optimize model setup


Photo credit: C. Rodell

Precipitation forecast performance over complex terrain

by J. Jeworrek, G. West, and R. Stull

Numerical weather prediction models rely on several physics packages, most of which approximate processes that are too small to be resolved at the numerical grid size at which the model computes the flow dynamics. The ideal combination of these physics parameterizations can vary with the weather characteristics; in southwest British Columbia, for example, precipitation is strongly influenced by seasons and mountains. Analysing a full year of model data from over 100 WRF configurations with systematically varied physics packages revealed how precipitation forecast performance depends on rainfall intensity, season, location, grid resolution, and accumulation window. We identified different model setups that performed best in the dry summer season and the cool wet season. These results informed our decision to update some of our operational model ensemble members, which have also shown outstanding performance in our seasonal wind forecast verification.
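One common score for comparing precipitation performance across model configurations is the Equitable Threat Score (ETS), computed from forecast and observed exceedances of a rainfall threshold. The sketch below uses made-up sample data; in practice such a score would be computed per configuration and stratified by season, intensity, and accumulation window:

```python
# Equitable Threat Score (ETS) from binary exceedances of a rain threshold.
# ETS = (hits - hits_random) / (hits + misses + false_alarms - hits_random),
# where hits_random corrects for hits expected by chance.
def ets(forecast_mm, observed_mm, threshold_mm):
    hits = misses = false_alarms = 0
    n = len(forecast_mm)
    for f, o in zip(forecast_mm, observed_mm):
        fe, oe = f >= threshold_mm, o >= threshold_mm
        hits += fe and oe
        misses += (not fe) and oe
        false_alarms += fe and (not oe)
    hits_random = (hits + false_alarms) * (hits + misses) / n
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0

# Illustrative 8-period sample (mm of accumulated precipitation).
fcst = [0.0, 2.0, 8.0, 12.0, 1.0, 6.0, 0.5, 9.0]
obs  = [0.0, 5.0, 7.0, 11.0, 0.0, 1.0, 0.2, 10.0]
score = ets(fcst, obs, threshold_mm=5.0)
```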


“Cleaning” precipitation gauge data with automated algorithms

Photo credit: J. Jeworrek

Deep-learning-based precipitation observation quality control

by Yingkai Sha, D.J. Gagne, G. West, and R. Stull

In British Columbia, gauge networks are deployed in several key regions; they provide measurements of rainfall and play important roles in making operational flood warnings.

Although useful in general, rain gauges can be unreliable: their measurement quality is degraded by harsh mountain weather, structural damage, and technical issues. Before rain gauge measurements are used as “truth”, poor-quality values should be identified and removed.
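The kinds of checks an automated quality-control algorithm might apply can be sketched with simple rules. The thresholds and rule set below are illustrative assumptions, not BC Hydro's actual criteria (and a deep-learning approach would learn such patterns rather than hard-code them):

```python
# Illustrative rule-based QC of hourly rain-gauge values (thresholds assumed).
def qc_flags(hourly_mm, max_hourly=100.0, stuck_run=12):
    """Return a parallel list of flags: 'ok', 'range', or 'stuck'."""
    flags = []
    for i, v in enumerate(hourly_mm):
        window = hourly_mm[max(0, i - stuck_run + 1):i + 1]
        if v < 0.0 or v > max_hourly:
            flags.append("range")   # physically implausible value
        elif len(window) == stuck_run and len(set(window)) == 1 and v > 0.0:
            flags.append("stuck")   # identical nonzero readings for hours on end
        else:
            flags.append("ok")
    return flags

series = [0.0, 1.2, -5.0, 3.4, 250.0, 0.8]  # made-up hourly values (mm)
flags = qc_flags(series)
```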

Well-maintained gauge networks, such as the BC Hydro network, have manual quality-control routines. These ensure the accuracy of rainfall measurements but require a great deal of human effort: from 2016 to 2018, roughly 30% of hourly rain gauge measurements were judged as poor quality by human experts.

To provide accurate and timely rainfall estimates, we are collaborating with BC Hydro to develop automated quality-control algorithms. The algorithm is still experimental, but we have made good progress: it can run on a near-real-time basis, and its performance is about 85% of that of a human expert.

Creating high-resolution surface weather fields over mountains

Photo credit: J. Jeworrek

Deep-learning-based gridded downscaling of surface meteorological variables in complex terrain

by Yingkai Sha, D.J. Gagne, G. West, and R. Stull

Regional weather forecasts are produced on grid points that represent averaged conditions of a fixed area. Public-oriented forecasts are typically created on coarse grid points; their usage is limited in British Columbia because they cannot resolve detailed weather patterns in the mountainous terrain.

To solve this problem, we are actively developing downscaling methods that take coarse forecasts as input and produce high-resolution surface weather fields based on terrain and climatological backgrounds. Our downscaling methods are based on convolutional neural networks with transfer-learning abilities: they can be trained in regions where high-quality data are available and then applied in surrounding areas. Currently, the downscaling work is at the research stage. In the near future, it will be incorporated into our forecast post-processing routines, refining the spatial details of forecasts in drainage basins.
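The underlying idea (interpolate the coarse field to the fine grid, then correct it with high-resolution terrain information) can be shown in one dimension. This sketch uses linear interpolation plus a standard atmospheric lapse-rate adjustment as a baseline; it is not the convolutional-neural-network method itself, and all grid values are invented:

```python
# 1-D toy baseline for terrain-aware downscaling of surface temperature.
def downscale_temperature(coarse_temp_c, coarse_elev_m, fine_elev_m,
                          lapse_rate_c_per_m=0.0065):
    """Linearly interpolate coarse temperature to a finer grid, then adjust
    for the difference between smoothed model terrain and true fine terrain
    using a constant lapse rate (6.5 C per km, the standard-atmosphere value)."""
    n_fine = len(fine_elev_m)
    out = []
    for j in range(n_fine):
        # Position of fine point j in coarse-grid index coordinates.
        x = j * (len(coarse_temp_c) - 1) / (n_fine - 1)
        i = min(int(x), len(coarse_temp_c) - 2)
        t = x - i
        temp = (1 - t) * coarse_temp_c[i] + t * coarse_temp_c[i + 1]
        elev = (1 - t) * coarse_elev_m[i] + t * coarse_elev_m[i + 1]
        out.append(temp - lapse_rate_c_per_m * (fine_elev_m[j] - elev))
    return out

coarse_t = [10.0, 8.0, 6.0]                     # deg C at 3 coarse points
coarse_z = [200.0, 600.0, 1000.0]               # smoothed model terrain (m)
fine_z = [200.0, 350.0, 900.0, 700.0, 1000.0]   # true terrain at 5 fine points
fine_t = downscale_temperature(coarse_t, coarse_z, fine_z)
```

A CNN-based downscaler plays the same role as the lapse-rate term here, but learns the terrain-to-weather relationship from data instead of assuming a constant rate.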

Modelling snow surface temperature for ski racing - a Whistler ski resort case study

by R. Howard and R. Stull

Accurately calculating snow-surface temperature and liquid-water content for a groomed and compacted ski run, known as a ski piste, is crucial to the preparation of fast skis for alpine racing. This research focuses on modelling the above variables for a clear-sky intensive observation period in February 2010. An automated weather station collected relevant meteorological data at a point on a ski piste in Whistler, BC, Canada, known as RC Whistler. The surface radiation budget is fundamental to this problem, and is affected by tall trees dominating the local horizon. Longwave radiation contributions from trees and sky were weighted by their view factors, and the total downwelling longwave radiation at the snow surface is modelled for RC Whistler, under clear skies.
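The view-factor weighting described above amounts to summing Stefan-Boltzmann emission from sky and trees, each weighted by the fraction of the hemisphere it occupies. The temperatures, emissivities, and view factors below are illustrative assumptions, not the RC Whistler values:

```python
# Downwelling longwave at the snow surface as a view-factor-weighted sum of
# sky and tree emission (Stefan-Boltzmann law). All inputs are illustrative.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W m^-2 K^-4)

def downwelling_longwave(f_sky, t_sky_k, t_tree_k,
                         emiss_sky=0.75, emiss_tree=0.98):
    """L_down = f_sky * e_sky * sigma * T_sky**4
              + f_tree * e_tree * sigma * T_tree**4,
    where the tree view factor is f_tree = 1 - f_sky."""
    f_tree = 1.0 - f_sky
    return (f_sky * emiss_sky * SIGMA * t_sky_k ** 4
            + f_tree * emiss_tree * SIGMA * t_tree_k ** 4)

# Trees emit as near-blackbodies at roughly air temperature, so tall trees
# dominating the horizon raise L_down compared with an open clear sky.
open_site = downwelling_longwave(f_sky=0.95, t_sky_k=260.0, t_tree_k=268.0)
forest_gap = downwelling_longwave(f_sky=0.60, t_sky_k=260.0, t_tree_k=268.0)
```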

A new one-dimensional numerical Lagrangian snowpack model has been written, solving the heat-, liquid-water-, and ice-budget equations to calculate the snow-surface temperature. Meteorological measurements from the clear-sky intensive observation period are prescribed as boundary conditions. Model components and parameters are validated and chosen with idealized model runs. Human factors were also considered, such as frequent skiers compacting the snowpack, and grooming snowcats that mix the top layer of the snowpack and work to increase the snow density and hardness, usually once daily. These effects are simulated in the numerical model. The model successfully simulates snow-surface temperature for the RC Whistler clear-sky intensive observation period.
