**Author: Kyle Hall (kjhall@iri.columbia.edu) - International Research Institute for Climate & Society**

**Author: Nachiketa Acharya (npa5302@psu.edu) - Center for Earth System Modeling, Analysis, & Data (ESMAD), The Pennsylvania State University**

#### POELM Overview

Multi-Model Ensembles (MMEs) are well known to produce more skillful forecasts than those produced by single GCMs. Here, we use the Probabilistic Output Extreme Learning Machine (POELM) to produce MME forecasts for the WMO S2S AI Prediction Challenge.

Extreme Learning Machine (ELM) is a fast variant of the Single-Layer Feed-Forward Neural Network. ELM's hidden-layer weights and biases are randomly initialized and then left unchanged. Whereas a traditional neural network would fit its hidden-layer neurons with a time-consuming backpropagation algorithm like stochastic gradient descent, ELM fits only its output layer, using the generalized Moore-Penrose inverse.
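The core ELM fitting step can be sketched in a few lines of NumPy. This is a minimal illustration of the idea described above (random, frozen hidden layer; output weights solved by Moore-Penrose pseudoinverse), not the POELM implementation used in the submission; the function names are hypothetical.

```python
import numpy as np

def fit_elm(X, Y, n_hidden=5, seed=0):
    """Fit a minimal ELM: random sigmoid hidden layer, pseudoinverse output fit."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))  # random input weights (never updated)
    b = rng.uniform(-1, 1, size=n_hidden)                # random biases (never updated)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))               # sigmoid hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                         # output weights via Moore-Penrose inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Predict with a fitted ELM."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because only a single linear least-squares solve is needed, training is orders of magnitude faster than iterative backpropagation.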

...

...

For all of this, our code uses the open source [XCast](https://github.com/kjhall01/xcast)

### S2S POELM MME Forecasts

Our S2S AI Competition submission uses POELM to produce probabilistic forecasts for the Week 3-4 and Week 5-6 target periods.

**Approach**

Here, we use an ensemble mean of five trained POELM models, each with five randomly initialized sigmoid hidden neurons. With hyperparameter tuning, the POELM approach could yield even better results.
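The ensemble-mean idea can be sketched as follows: train several independently initialized ELMs and average their predictions. This is a hedged NumPy illustration of the structure (deterministic outputs rather than POELM's probabilistic ones), not the XCast code used for the submission; all names are hypothetical.

```python
import numpy as np

def elm_predict(X_train, Y_train, X_new, n_hidden=5, seed=0):
    """Train one ELM (random sigmoid hidden layer, pseudoinverse output fit) and predict."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X_train.shape[1], n_hidden))
    b = rng.uniform(-1, 1, size=n_hidden)
    sig = lambda Z: 1.0 / (1.0 + np.exp(-Z))
    beta = np.linalg.pinv(sig(X_train @ W + b)) @ Y_train
    return sig(X_new @ W + b) @ beta

def ensemble_mean(X_train, Y_train, X_new, n_members=5, n_hidden=5):
    """Average predictions from n_members independently initialized ELMs."""
    preds = [elm_predict(X_train, Y_train, X_new, n_hidden, seed=s)
             for s in range(n_members)]
    return np.mean(preds, axis=0)
```

Averaging over members smooths out the variance introduced by the random hidden-layer initialization, which is the main source of run-to-run noise in ELM-type models.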

**Predictors**

We use three deterministic GCM outputs provided by the S2S AI competition: ECCC, ECMWF, and NCEP-CFSv2. Our model is trained on data from 2000-2010, since that is the only period for which data is available for all three models.

Because the models are not initialized on the same days, we select the dates required for the competition and then, for each model, choose the closest initialization preceding each date. We then adjust the selected lead times accordingly to recover the correct target period. Finally, the predictors are scaled to [-1, 1] using MinMax scaling.
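The final scaling step can be sketched as a standard MinMax transform to [-1, 1], fit on the training period and applied to new data. This is a minimal illustration, assuming per-feature scaling with training-period min/max; the function name is hypothetical.

```python
import numpy as np

def minmax_scale(train, new=None, lo=-1.0, hi=1.0):
    """Scale features to [lo, hi] using training-period min/max (per feature)."""
    tmin, tmax = train.min(axis=0), train.max(axis=0)
    span = np.where(tmax - tmin == 0, 1.0, tmax - tmin)  # guard against constant features
    scale = lambda a: lo + (a - tmin) * (hi - lo) / span
    return scale(train) if new is None else scale(new)
```

Fitting the min/max on the training period only (and reusing those statistics at forecast time) avoids leaking information from the forecast period into the predictors.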