35 results for "Taha B. M. J. Ouarda"
Search Results
2. Multivariate Nonstationary Oscillation Simulation of Climate Indices With Empirical Mode Decomposition
- Author
-
Taesam Lee and Taha B. M. J. Ouarda
- Subjects
Multivariate statistics, Arctic oscillation, Stochastic modelling, Climatology, Resampling, Stochastic simulation, Autoregressive–moving-average model, Hilbert–Huang transform, Pacific decadal oscillation, Water Science and Technology, Mathematics
- Abstract
The objective of the current study is to build a stochastic model to simulate climate indices that are teleconnected with the hydrologic regimes of large‐scale water resources systems such as the Great Lakes system. Climate indices generally contain nonstationary oscillations (NSOs). We adopted a stochastic simulation model based on Empirical Mode Decomposition (EMD). The procedure for the model is to decompose the observed series and then to simulate the decomposed components with the NSO resampling (NSOR) technique. Because the model has previously been applied only to single variables, a multivariate version of NSOR (M‐NSOR) is developed to consider the links between the climate indices and to reproduce the NSO process. The proposed M‐NSOR model is tested in a simulation study on the Rössler system. The simulation results indicate that the M‐NSOR model reproduces the significant oscillatory behaviors of the system and the marginal statistical characteristics. Subsequently, the M‐NSOR model is applied to three climate indices (i.e., the Arctic Oscillation, El Niño–Southern Oscillation, and Pacific Decadal Oscillation) for the annual and winter data sets. The results of the proposed model are compared to those of the Contemporaneous Shifting Mean and Contemporaneous Autoregressive Moving Average model. The results indicate that the proposed M‐NSOR model is superior to the latter for reproducing the NSO process, while the other basic statistics are comparatively well preserved in both cases. The current study concludes that the proposed M‐NSOR model can be a good alternative to simulate NSO processes and their teleconnections with climate indices.
- Published
- 2019
- Full Text
- View/download PDF
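The decomposition step that NSOR and M-NSOR build on can be sketched in a few lines. This is a simplified illustration only: it uses linear envelopes and a fixed number of sifting iterations, whereas canonical EMD uses cubic-spline envelopes and a stopping criterion, and the function names here are ours, not the paper's.

```python
import numpy as np

def sift(x, n_sifts=10):
    """Extract one intrinsic mode function via a simplified sifting loop
    (linear envelopes; canonical EMD uses cubic splines)."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_sifts):
        # local extrema (endpoints included to anchor the envelopes)
        maxima = [0] + [i for i in range(1, len(h) - 1)
                        if h[i] >= h[i - 1] and h[i] >= h[i + 1]] + [len(h) - 1]
        minima = [0] + [i for i in range(1, len(h) - 1)
                        if h[i] <= h[i - 1] and h[i] <= h[i + 1]] + [len(h) - 1]
        upper = np.interp(t, maxima, h[maxima])
        lower = np.interp(t, minima, h[minima])
        h = h - (upper + lower) / 2.0   # remove the local mean envelope
    return h

def emd(x, n_imfs=3):
    """Decompose x into IMFs plus a residual; by construction they
    sum back to the original signal."""
    imfs, residual = [], x.astype(float)
    for _ in range(n_imfs):
        imf = sift(residual)
        imfs.append(imf)
        residual = residual - imf
    return imfs, residual
```

The resampling stage (NSOR) then simulates each extracted component separately; only the decomposition is shown here.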
3. Fully nonlinear statistical and machine‐learning approaches for hydrological frequency estimation at ungauged sites
- Author
-
Dhouha Ouali, Taha B. M. J. Ouarda, and Fateh Chebana
- Subjects
Global and Planetary Change, Mathematical optimization, Artificial neural network, Computer science, Hydrological modelling, Generalized additive model, Linear model, Nonlinear system, Linear regression, General Earth and Planetary Sciences, Environmental Chemistry, Linear combination, Canonical correlation
- Abstract
The high complexity of hydrological systems has long been recognized. Despite the increasing number of statistical techniques that aim to estimate hydrological quantiles at ungauged sites, few approaches have been designed to account for the possible nonlinear connections between hydrological variables and catchment characteristics. Recently, a number of nonlinear machine-learning tools have received attention in regional frequency analysis (RFA) applications, especially for estimation purposes. In this paper, the aim is to study nonlinearity-related aspects in the RFA of hydrological variables using statistical and machine-learning approaches. To this end, a variety of combinations of linear and nonlinear approaches are considered in the main RFA steps (delineation and estimation). Artificial neural networks (ANNs) and generalized additive models (GAMs) are combined with a nonlinear ANN-based canonical correlation analysis (NLCCA) procedure to ensure an appropriate nonlinear modeling of the complex processes involved. A comparison is carried out between classical linear combinations (CCA combined with a linear regression (LR) model), semilinear combinations (e.g., NLCCA with LR), and fully nonlinear combinations (e.g., NLCCA with GAM). The considered models are applied to three different data sets located in North America. Results indicate that fully nonlinear models (in both RFA steps) are the most appropriate, since they provide the best performances and a more realistic description of the physical processes involved, even though they are relatively more complex than linear ones. On the other hand, semilinear models, which consider nonlinearity in either the delineation or the estimation step, showed little improvement over linear models. The linear approaches provided the lowest performances.
- Published
- 2017
- Full Text
- View/download PDF
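The linear baseline that NLCCA generalizes is classical canonical correlation analysis, which can be computed from whitened cross-covariances. A minimal sketch (the `reg` ridge term and the function name are our own choices, not from the paper):

```python
import numpy as np

def cca(X, Y, reg=1e-8):
    """Classical linear CCA: canonical correlations between two blocks
    of variables. A small ridge `reg` keeps the covariance inverses
    numerically stable."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    # whiten each block, then take the SVD of the cross-covariance:
    # the singular values are the canonical correlations
    Lx = np.linalg.cholesky(np.linalg.inv(Sxx))
    Ly = np.linalg.cholesky(np.linalg.inv(Syy))
    corrs = np.linalg.svd(Lx.T @ Sxy @ Ly, compute_uv=False)
    return np.clip(corrs, 0.0, 1.0)
```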
4. Urban climate modifications in hot desert cities: The role of land cover, local climate, and seasonality
- Author
-
Hosni Ghedira, Annalisa Molini, Prashanth Reddy Marpu, Taha B. M. J. Ouarda, and Michele Lazzarini
- Subjects
Geophysics, Urban climate, Climatology, Temperate climate, Impervious surface, General Earth and Planetary Sciences, Environmental science, Vegetation, Land cover, Urban heat island, Arid, Normalized Difference Vegetation Index
- Abstract
Urban climate modifications like the urban heat island (UHI) have been extensively investigated in temperate regions. In contrast, the understanding of how urbanization relates to climate in hot, hyperarid environments is still extremely limited, despite the growing socioeconomic relevance of arid lands and their fast urbanization rate. We explore here the relationship between land cover and temperature regime in hot desert cities (HDCs) based on estimates of land surface temperature, normalized difference vegetation index, and impervious surface areas inferred from Moderate Resolution Imaging Spectroradiometer and Landsat satellite products. Our analysis shows that HDCs display common climatic patterns, with downtown areas on average cooler than suburbs during the daytime (urban cool island) and warmer at night (classical UHI). The observed diurnal cool island effect can be largely explained by relative vegetation abundance, percentage of bare soil, and local climatic conditions, and calls for a more in-depth investigation of the physical processes regulating boundary layer dynamics in arid regions.
- Published
- 2015
- Full Text
- View/download PDF
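The two quantities at the heart of this analysis, NDVI and the urban–rural temperature contrast, are simple per-pixel computations. A sketch under the assumption that land surface temperature and band reflectances are already gridded as arrays (variable and function names are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index per pixel."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def heat_island_intensity(lst, urban_mask):
    """Urban minus rural mean land surface temperature.
    Positive -> classical UHI; negative -> an urban cool island."""
    lst = np.asarray(lst, float)
    return lst[urban_mask].mean() - lst[~urban_mask].mean()
```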
5. Modeling the relationship between climate oscillations and drought by a multivariate GARCH model
- Author
-
Reza Modarres and Taha B. M. J. Ouarda
- Subjects
Heteroscedasticity, Multivariate statistics, Autoregressive model, Autoregressive conditional heteroskedasticity, Statistics, Econometrics, Univariate, Contrast (statistics), Covariance, Conditional variance, Water Science and Technology, Mathematics
- Abstract
Typical multivariate time series models may exhibit comovement in the mean but not in the variance of hydrologic and climatic variables. This paper introduces multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models to capture the comovement of the variance, or the conditional covariance, between two hydroclimatic time series. The diagonal vectorized and Baba-Engle-Kraft-Kroner models are developed to evaluate the covariance between drought and two atmospheric circulation indices, the Southern Oscillation Index (SOI) and the North Atlantic Oscillation (NAO), during 1954-2000. The univariate GARCH model indicates a strong persistence level in the conditional variance for NAO and a moderate persistence level for SOI. The conditional variance of the short-term drought index indicates a low level of persistence, while the long-term drought index indicates a high level of persistence in the conditional variance. The estimated conditional covariance between drought and the atmospheric indices is shown to be weak and negative. It is also observed that the covariance between drought and the atmospheric indices depends largely on the short-run variance of the atmospheric indices rather than on their long-run variance. The nonlinearity and stationarity tests show that the conditional covariances are nonlinear but stationary. However, the degree of nonlinearity is higher for the covariance between long-term drought and the atmospheric indices. It is also observed that the nonlinearity of NAO is higher than that of SOI, whereas stationarity is stronger for the SOI time series.
Key Points:
- Multivariate heteroscedastic models are developed for drought analysis
- Conditional covariance between drought, SOI, and NAO is not strong
- Time-varying correlations between drought and atmospheric indices are estimated
- Published
- 2014
- Full Text
- View/download PDF
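The univariate building block of these models is the GARCH(1,1) conditional variance recursion, whose persistence is measured by alpha + beta. A minimal sketch (parameter estimation, which the paper performs, is omitted; names are ours):

```python
import numpy as np

def garch11_variance(eps, omega, alpha, beta):
    """Conditional variance recursion of a GARCH(1,1) model:
        sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]
    alpha + beta < 1 gives a stationary process; their sum measures the
    persistence of shocks in the variance."""
    eps = np.asarray(eps, float)
    sigma2 = np.empty_like(eps)
    sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2
```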
6. Depth-based regional index-flood model
- Author
-
Hussein Wazneh, Taha B. M. J. Ouarda, and Fateh Chebana
- Subjects
Data set, Weight function, Identification (information), Basis (linear algebra), Cross-correlation, Similarity (network science), Statistics, Performance improvement, Water Science and Technology, Weighting, Mathematics
- Abstract
Regional flood frequency analysis aims to estimate flood risk at sites where little or no hydrological data are available. The index-flood model is one of the most commonly employed models for this purpose. In this model, the predicted value depends on the growth curve and its regional parameters. The latter are estimated as weighted averages of the at-site parameters. Traditional approaches are mainly based on site record lengths or region size to define these weights. Hence, they are not representative of the hydrological similarity between sites within a region. In addition, they are not defined to reach optimality in terms of model performance. To overcome these limitations, the present paper proposes a new optimal iterative weighting scheme for the index-flood model. The proposed approach is based on a number of elements: a statistical depth function to introduce similarity between sites, a weight function to amplify and control the depth values, an iterative procedure to improve estimation accuracy, and an optimization algorithm to objectively automate the choice of the weight function. A data set from the Island of Sicily (Italy) is used to compare the proposed approach with traditional ones. On the basis of the L-moments and using cluster analysis techniques, the studied region is subdivided into three homogeneous subregions. The results indicate that the proposed approach performs significantly better than traditional ones in terms of both relative bias and relative root mean square error. The proposed approach allows identification of cross-correlation in the region and provides a significant performance improvement.
- Published
- 2013
- Full Text
- View/download PDF
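The depth-then-weight idea can be illustrated with a Mahalanobis-type depth, one common choice of depth function; the paper's actual depth and weight functions may differ, and the power weight here is a stand-in for its optimized weight function:

```python
import numpy as np

def mahalanobis_depth(points):
    """Depth of each site in attribute space: D = 1 / (1 + d^2),
    where d is the Mahalanobis distance to the sample centre."""
    X = np.asarray(points, float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diffs = X - mu
    d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    return 1.0 / (1.0 + d2)

def depth_weights(points, power=2.0):
    """Amplify depths with a power weight function and normalise, so
    central (more 'similar') sites receive larger weights in the
    regional average of at-site parameters."""
    w = mahalanobis_depth(points) ** power
    return w / w.sum()
```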
7. Predictor selection for downscaling GCM data with LASSO
- Author
-
Taesam Lee, Jonghyun Lee, Dorra Hammami, and Taha B. M. J. Ouarda
- Subjects
Atmospheric Science, Mean squared error, Computer science, Soil Science, Aquatic Science, Oceanography, Lasso (statistics), Geochemistry and Petrology, Statistics, Earth and Planetary Sciences (miscellaneous), Selection (genetic algorithm), Earth-Surface Processes, Water Science and Technology, Ecology, Paleontology, Forestry, Collinearity, Regression, Data set, Geophysics, Space and Planetary Science, Climate model, Data mining, Downscaling
- Abstract
Over the last 10 years, downscaling techniques, including both dynamical (i.e., regional climate model) and statistical methods, have been widely developed to provide climate change information at a finer resolution than that provided by global climate models (GCMs). Because one of the major aims of downscaling techniques is to provide the most accurate information possible, data analysts have tried a number of approaches to improve predictor selection, which is one of the most important steps in downscaling. Classical methods such as regression techniques, particularly stepwise regression (SWR), have been employed for downscaling. However, SWR has some limitations, such as deficiencies in dealing with collinearity problems, while also producing overly complex models. Thus, the least absolute shrinkage and selection operator (LASSO) technique, a penalized regression method, is presented as an alternative for predictor selection in downscaling GCM data. It may allow for more accurate and simpler models that can properly deal with collinearity problems. Therefore, the objective of the current study is to compare the performance of a classical regression method (SWR) with that of the LASSO technique for predictor selection. A data set from 9 stations located in the southern region of Quebec that includes 25 predictors measured over 29 years (from 1961 to 1990) is employed. The results indicate that, owing to its computational advantages and its ease of implementation, the LASSO technique performs better than SWR, giving better results according to the coefficient of determination and the RMSE used as comparison criteria.
- Published
- 2012
- Full Text
- View/download PDF
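LASSO's ability to zero out redundant predictors comes from the soft-thresholding step of coordinate descent. A bare-bones sketch of that mechanism (a production analysis would use a tuned, cross-validated solver; function names are ours):

```python
import numpy as np

def soft_threshold(rho, lam):
    """Shrink toward zero; exactly zero when |rho| <= lam."""
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_cd(X, y, lam, n_iters=200):
    """LASSO via cyclic coordinate descent. Minimises
        (1 / 2n) * ||y - X b||^2 + lam * ||b||_1
    Columns of X should be centred and standardised beforehand."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iters):
        for j in range(p):
            # partial residual: leave predictor j's contribution out
            resid = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ resid / n
            z = X[:, j] @ X[:, j] / n
            beta[j] = soft_threshold(rho, lam) / z
    return beta
```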
8. Exploratory functional flood frequency analysis and outlier detection
- Author
-
Sophie Dabo-Niang, Taha B. M. J. Ouarda, and Fateh Chebana
- Subjects
Multivariate statistics, Flood myth, Computer science, Univariate, Functional data analysis, Hydrograph, Context (language use), Data visualization, Statistics, Outlier, Data mining, Water Science and Technology
- Abstract
The prevention of flood risks and the effective planning and management of water resources require river flows to be continuously measured and analyzed at a number of stations. For a given station, a hydrograph can be obtained as a graphical representation of the temporal variation of flow over a period of time. The information provided by the hydrograph is essential to determine the severity of extreme events and their frequencies. A flood hydrograph is commonly characterized by its peak, volume, and duration. Traditional hydrological frequency analysis (FA) approaches have focused on each of these features separately, in a univariate context. Recent multivariate approaches consider these features jointly in order to take their dependence structure into account. However, all these approaches are based on the analysis of a small number of characteristics and do not make use of the full information content of the hydrograph. The objective of the present work is to propose a new framework for FA that uses the hydrographs as curves: functional data. In this context, the whole hydrograph is considered as one infinite-dimensional observation, which allows more effective and efficient estimates of the risk associated with extreme events. The proposed approach contributes to addressing the problem of lack of data commonly encountered in hydrology by fully employing all the information contained in the hydrographs. A number of functional data analysis tools are introduced and adapted to flood FA, with a focus on exploratory analysis as a first stage toward a complete functional flood FA. These methods, including data visualization, location and scale measures, principal component analysis, and outlier detection, are illustrated in a real-world flood analysis case study from the province of Quebec, Canada.
- Published
- 2012
- Full Text
- View/download PDF
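Treating discretized hydrographs as rows of a matrix, the principal component analysis mentioned above reduces to an SVD of the centered curves. A sketch (the smoothing and basis-expansion machinery of true functional PCA is omitted; names are ours):

```python
import numpy as np

def functional_pca(curves):
    """PCA of discretised curves (rows = hydrographs, columns = time
    points): returns the mean curve, the principal modes of variation,
    the per-curve scores, and the explained-variance fractions."""
    X = np.asarray(curves, float)
    mean = X.mean(axis=0)
    Xc = X - mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt            # one mode of variation per row
    scores = U * s             # projection of each curve on the modes
    explained = s ** 2 / np.sum(s ** 2)
    return mean, components, scores, explained
```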
9. Improved methods for daily streamflow estimates at ungauged sites
- Author
-
Chang Shu and Taha B. M. J. Ouarda
- Subjects
Mean squared error, Logarithm, Geographical distance, Streamflow, Statistics, Jackknife resampling, Regression, Water Science and Technology, Mathematics, Multivariate interpolation, Interpolation
- Abstract
In this paper, improved flow duration curve (FDC) and area ratio (AR) based methods are developed to obtain better daily streamflow estimates at ungauged sites. A regression-based logarithmic interpolation method, which makes no assumption on the distribution or shape of an FDC, is introduced to estimate regional FDCs. The estimated FDC is combined with a spatial interpolation algorithm to obtain daily streamflow estimates. AR methods based on multiple source sites, especially the geographical-distance-weighted AR (GWAR) method, are introduced in an effort to maximize the use of regional information and improve on the standard AR (SAR) method. The performance of the proposed approaches is evaluated using a jackknife procedure. The application to 109 stations in the province of Quebec, Canada, indicates that the FDC-based methods outperform the AR-based methods in all the summary statistics, including the Nash coefficient, root mean squared error (RMSE), and bias. The number of sites that show better performance using the FDC-based approaches is also significantly larger than the number of sites showing better performance using the AR-based methods. Using geographical-distance-weighted multiple-source-site approaches improves the performance at the majority of the catchments compared with single-source-site approaches.
- Published
- 2012
- Full Text
- View/download PDF
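The two estimation ideas compared here can be sketched directly: interpolation of a flow duration curve in log space, and the standard area-ratio (SAR) transfer. The piecewise-linear form and the function names are our simplifications of the paper's regression-based scheme:

```python
import numpy as np

def fdc_interpolate(probs, flows, p_new):
    """Flow duration curve lookup by interpolating log-flows against
    exceedance probability (no distributional assumption on the FDC)."""
    order = np.argsort(probs)
    return np.exp(np.interp(p_new,
                            np.asarray(probs, float)[order],
                            np.log(np.asarray(flows, float))[order]))

def area_ratio(q_gauged, area_gauged, area_ungauged):
    """Standard area-ratio (SAR) transfer of daily flows: scale the
    gauged record by the ratio of drainage areas."""
    return (area_ungauged / area_gauged) * np.asarray(q_gauged, float)
```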
10. Stochastic simulation of nonstationary oscillation hydroclimatic processes using empirical mode decomposition
- Author
-
Taesam Lee and Taha B. M. J. Ouarda
- Subjects
Stochastic modelling, Resampling, Stochastic simulation, Econometrics, Nonparametric statistics, Trigonometric functions, Applied mathematics, Time series, Hilbert–Huang transform, Bootstrapping (statistics), Water Science and Technology, Mathematics
- Abstract
Nonstationary oscillation (NSO) processes are observed in a number of hydroclimatic data series. Stochastic simulation models are useful to study the impacts of the climatic variations induced by NSO processes on hydroclimatic regimes. Reproducing NSO processes in a stochastic time series model is, however, a difficult task because of the complexity of the nonstationary behaviors. In the current study, a novel stochastic simulation technique that reproduces the NSO processes embedded in hydroclimatic data series is presented. The proposed model reproduces NSO processes by utilizing empirical mode decomposition (EMD) and nonparametric simulation techniques (i.e., k-nearest-neighbor resampling and block bootstrapping). The model was first tested with synthetic data sets from trigonometric functions and the Rössler system. The North Atlantic Oscillation (NAO) index was then examined as a real case study. This NAO index was then employed as an exogenous variable for the stochastic simulation of streamflows at the Romaine River in the province of Quebec, Canada. The results of the application to the synthetic data sets and the real-world case studies indicate that the proposed model preserves well the NSO processes along with the key statistical characteristics of the observations. It was concluded that the proposed model possesses a reasonable simulation capacity and a high potential as a stochastic model, especially for hydroclimatic data sets that embed NSO processes.
- Published
- 2012
- Full Text
- View/download PDF
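Of the two nonparametric ingredients named above, the block bootstrap is the simpler to sketch: it preserves short-range dependence by resampling contiguous blocks rather than individual values. The k-nearest-neighbor resampling step is omitted, and the function name is ours:

```python
import numpy as np

def block_bootstrap(series, block_len, n_out, rng=None):
    """Moving-block bootstrap: concatenate randomly chosen contiguous
    blocks of the series, then trim to the requested output length."""
    rng = np.random.default_rng(rng)
    x = np.asarray(series)
    n_blocks = int(np.ceil(n_out / block_len))
    starts = rng.integers(0, len(x) - block_len + 1, size=n_blocks)
    out = np.concatenate([x[s:s + block_len] for s in starts])
    return out[:n_out]
```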
11. Spring flood reconstruction from continuous and discrete tree ring series
- Author
-
Étienne Boucher, Yves Bégin, Taha B. M. J. Ouarda, and Antoine Nicault
- Subjects
Variable (computer science), Tree (data structure), Flood myth, Calibration (statistics), Climatology, Streamflow, Generalized additive model, Dendrochronology, Jackknife resampling, Geology, Water Science and Technology
- Abstract
This study proposes a method to reconstruct past spring flood discharge from continuous and discrete tree-ring chronologies, since both have their respective strengths and weaknesses in northern environments. Ring width or density series provide uninterrupted records that are indirectly linked to regional discharge through a concomitant effect of climate on tree growth and streamflow. Conversely, discrete event chronologies constitute conspicuous records of past high water levels, since they are constructed from trees that are directly damaged by the flood. However, the uncertainty of discrete series increases toward the past, and their relationships with spring discharge are often nonlinear. To take advantage of these two sources of information, we introduce a new transfer model technique based on generalized additive model (GAM) theory. The incorporation of discrete predictors and the robustness of the nonlinear relationships are assessed using a jackknife procedure. We exemplify our approach in a reconstruction of May water supplies to the Caniapiscau hydroelectric reservoir in northern Quebec, Canada. We used earlywood density measurements as continuous variables and ice-scar dates around Lake Montausier in the James Bay area as a discrete variable. Strong calibration (0.57 < 0.61 < 0.75) and validation (0.27 < 0.44 < 0.58) R2 statistics were obtained, thus highlighting the usefulness of the model. Our reconstruction suggests that, since ∼1965, spring floods have become more intense and variable in comparison with the previous 150 years. We argue that a similar procedure can be used wherever discrete and continuous tree-ring proxies are available together to reconstruct past spring floods.
- Published
- 2011
- Full Text
- View/download PDF
12. Depth-based multivariate descriptive statistics with hydrological applications
- Author
-
Taha B. M. J. Ouarda and Fateh Chebana
- Subjects
Atmospheric Science, Multivariate statistics, Multivariate analysis, Ecology, Descriptive statistics, Computer science, Paleontology, Soil Science, Forestry, Aquatic Science, Oceanography, Geophysics, Bivariate data, Space and Planetary Science, Geochemistry and Petrology, Skewness, Outlier, Statistics, Earth and Planetary Sciences (miscellaneous), Kurtosis, Anomaly detection, Earth-Surface Processes, Water Science and Technology
- Abstract
Hydrological events are often described through various characteristics which are generally correlated. To be realistic, these characteristics must be considered jointly. In multivariate hydrological frequency analysis, the focus has been on modeling multivariate samples using copulas. However, prior to this step, data should be visualized and analyzed in a descriptive manner. This preliminary step is essential for all of the remaining analysis. It provides information concerning the location, scale, skewness, and kurtosis of the sample, as well as outlier detection. These features are useful to exclude some unusual data, to make different comparisons, and to guide the selection of the appropriate model. In the present paper we introduce methods for measuring these features, mainly based on the notion of a depth function. The application of these techniques is illustrated on two real-world streamflow data sets from Canada. In the Ashuapmushuan case study, there are no outliers and the bivariate data are likely to be elliptically symmetric and heavy-tailed. The Magpie case study contains a number of outliers, which are identified as real observed data. These observations cannot be removed and should be accommodated by considering robust methods for further analysis. The presented depth-based techniques can be adapted to a variety of hydrological variables.
- Published
- 2011
- Full Text
- View/download PDF
13. Prediction of climate nonstationary oscillation processes with empirical mode decomposition
- Author
-
Taha B. M. J. Ouarda and Taesam Lee
- Subjects
Rössler attractor, Atmospheric Science, Ecology, Stochastic modelling, Mode (statistics), Paleontology, Soil Science, Forestry, Aquatic Science, Oceanography, Synthetic data, Hilbert–Huang transform, Geophysics, Space and Planetary Science, Geochemistry and Petrology, Resampling, Attractor, Statistics, Earth and Planetary Sciences (miscellaneous), Applied mathematics, Decomposition method (constraint satisfaction), Earth-Surface Processes, Water Science and Technology, Mathematics
- Abstract
Long-term nonstationary oscillations (NSOs) are commonly observed in climatological data series such as global surface temperature anomalies (GSTA) and low-frequency climate oscillation indices. In this work, we present a stochastic model that captures NSOs within a given variable. The model employs a data-adaptive decomposition method named empirical mode decomposition (EMD). Irregular oscillatory processes in a given variable can be extracted into a finite number of intrinsic mode functions with the EMD approach. A unique data-adaptive algorithm is proposed in the present paper in order to study the future evolution of the NSO components extracted with EMD. To evaluate the model performance, the model is tested with a synthetic data set from the Rössler attractor and with GSTA data. The results for the attractor show that the proposed approach provides a good characterization of the NSOs. For the GSTA data, the last 30 observations are truncated and compared to the generated data. The model is then used to predict the evolution of the GSTA data over the next 50 years. The results of the case study confirm the power of the EMD approach and the proposed NSO resampling (NSOR) method, as well as their potential for the study of climate variables.
- Published
- 2011
- Full Text
- View/download PDF
14. Precipitation variability over UAE and global SST teleconnections
- Author
-
Taha B. M. J. Ouarda and K. Niranjan Kumar
- Subjects
Atmospheric Science, Rain gauge, Ocean current, Rossby wave, Jet stream, Atmospheric sciences, Latitude, Geophysics, Space and Planetary Science, Anticyclone, Climatology, Earth and Planetary Sciences (miscellaneous), Environmental science, Precipitation, Teleconnection
- Abstract
The present study investigates the role of equatorial Pacific sea surface temperatures (SSTs) in the precipitation variability over the United Arab Emirates (UAE) and adjoining Middle East regions. Monthly precipitation data (1981–2011) assembled from rain gauge stations located in the UAE, along with other global reanalysis data sets, are used to explore the teleconnections. It is observed that statistically significant correlations exist between precipitation over the UAE and the equatorial Pacific and North Atlantic SSTs. Canonical correlation analysis between the monthly winter precipitation and the global SSTs (60°S to 60°N) reveals that the major portion of the precipitation variability is influenced by equatorial Pacific SSTs associated with the El Niño–Southern Oscillation (ENSO). The moisture budget analysis reveals a distinct change in the anomalous circulation (cyclonic and anticyclonic) associated with strong convergence and divergence of the moisture flux during the warm and cold phases of ENSO, respectively. Further, the composite analysis of upper-tropospheric zonal wind shows an equatorward shift (~2° latitude) of the subtropical jet stream (STJ) over the Middle East during the warm phase of ENSO, affecting the weather in the UAE. The findings suggest that the teleconnection linking ENSO and the precipitation over the UAE and adjoining regions is mediated by the response of the STJ to Rossby waves.
- Published
- 2014
- Full Text
- View/download PDF
15. Long-term prediction of precipitation and hydrologic extremes with nonstationary oscillation processes
- Author
-
Taesam Lee and Taha B. M. J. Ouarda
- Subjects
Atmospheric Science, Ecology, Meteorology, Series (mathematics), Oscillation, Paleontology, Soil Science, Forestry, Aquatic Science, Oceanography, Hilbert–Huang transform, Geophysics, Space and Planetary Science, Geochemistry and Petrology, Streamflow, Earth and Planetary Sciences (miscellaneous), Environmental science, Hydrometeorology, Precipitation, Long-term prediction, Extreme value theory, Earth-Surface Processes, Water Science and Technology
- Abstract
Nonstationary oscillations in climatic variables and indices have been the focus of many studies. Since climate indices and their associated hydrometeorological variables may contain nonstationary oscillation processes, it is useful to be able to divide the intrinsic nonstationary oscillation into a finite number of components. Those components can then be used to predict the future evolution of the system. In the current study, nonstationary oscillations of certain time series are extracted using a decomposition analysis called empirical mode decomposition (EMD). The most important EMD components are then modeled with a nonstationary oscillation resampling (NSOR) technique. To predict a long-term oscillation pattern, a time series with a long record is required. The normalized regional precipitation of eastern Canada is one such series. In a second example, the future evolution of extreme streamflows at two stations in the province of Quebec, Canada, is studied by using the long-term patterns of climatic indices. Results indicate that the future long-term patterns are well modeled with NSOR and EMD. However, the indirect approach to finding the interconnection sometimes gives rise to a high prediction uncertainty.
- Published
- 2010
- Full Text
- View/download PDF
16. Regional low-flow frequency analysis using single and ensemble artificial neural networks
- Author
-
Taha B. M. J. Ouarda and C. Shu
- Subjects
Artificial neural network, Generalization, Bootstrap aggregating, Computer science, Regression analysis, Machine learning, Multilayer perceptron, Artificial intelligence, Data mining, Jackknife resampling, Water Science and Technology, Parametric statistics, Quantile
- Abstract
In this paper, artificial neural networks (ANNs) are introduced to obtain improved regional low-flow estimates at ungauged sites. A multilayer perceptron (MLP) network is used to identify the functional relationship between low-flow quantiles and the physiographic variables. Each ANN is trained using the Levenberg-Marquardt algorithm. To improve the generalization ability of a single ANN, several ANNs trained for the same task are used as an ensemble. The bootstrap aggregation (or bagging) approach is used to generate the individual networks of the ensemble. The stacked generalization (or stacking) technique is adopted to combine the member networks of an ANN ensemble. The proposed approaches are applied to selected catchments in the province of Quebec, Canada, to obtain estimates for several representative low-flow quantiles of the summer and winter seasons. The jackknife validation procedure is used to evaluate the performance of the proposed models. The ANN-based approaches are compared with traditional parametric regression models. The results indicate that both the single and ensemble ANN models provide more accurate estimates than the traditional regression models. The ANN ensemble approaches provide better generalization ability than the single ANN models.
- Published
- 2009
- Full Text
- View/download PDF
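The bagging step can be sketched independently of the network architecture; here a linear least-squares learner stands in for the paper's MLP members, purely for illustration (stacking, the second combination scheme, is omitted):

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def bagged_predict(X_train, y_train, X_new, n_members=25, rng=None):
    """Bootstrap aggregation: fit each ensemble member on a resampled
    training set, then average the member predictions."""
    rng = np.random.default_rng(rng)
    n = len(X_train)
    A_new = np.column_stack([np.ones(len(X_new)), X_new])
    preds = []
    for _ in range(n_members):
        idx = rng.integers(0, n, size=n)           # bootstrap resample
        coef = fit_linear(X_train[idx], y_train[idx])
        preds.append(A_new @ coef)
    return np.mean(preds, axis=0)
```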
17. Index flood-based multivariate regional frequency analysis
- Author
-
Fateh Chebana and Taha B. M. J. Ouarda
- Subjects
Multivariate statistics, Frequency analysis, Flood myth, Homogeneity (statistics), Statistics, Econometrics, Univariate, Bivariate analysis, Water Science and Technology, Quantile, Mathematics
- Abstract
Because of their multivariate nature, several hydrological phenomena can be described by more than one correlated characteristic. These characteristics are generally not independent and should be considered jointly. Consequently, univariate regional frequency analysis (FA) cannot provide a complete assessment of the true probabilities of occurrence. The objective of the present paper is to propose a procedure for regional flood FA in a multivariate framework. The focus is on the estimation step of regional FA. The proposed procedure represents a multivariate version of the index flood model and is based on copulas and a multivariate quantile version, with a focus on the bivariate case. The model offers increased flexibility to designers by leading to several scenarios associated with the same risk. The univariate quantiles represent special cases corresponding to the extreme scenarios. A simulation study is carried out to evaluate the performance of the model in a bivariate framework. Simulation results show that bivariate FA provides the univariate quantiles with equivalent accuracy. Similarity is observed between the results of the bivariate model and those of the univariate one in terms of the behavior of the corresponding performance criteria. The procedure performs better when the regional homogeneity is high. Furthermore, the impacts of small variations in the record length at gauged sites and in the region size on the performance of the proposed procedure are not significant.
- Published
- 2009
- Full Text
- View/download PDF
18. Intercomparison of homogenization techniques for precipitation data continued: Comparison of two recent Bayesian change point models
- Author
-
Xuebin Zhang, Ousmane Seidou, Claudie Beaulieu, and Taha B. M. J. Ouarda
- Subjects
Computer science ,Homogeneous ,Simulated data ,Bayesian probability ,Linear regression ,Homogenization (climate) ,Climate change ,Time point ,Algorithm ,Change detection ,Water Science and Technology ,Remote sensing - Abstract
In this paper, two new Bayesian change point techniques are described and compared to eight other techniques presented in previous work to detect inhomogeneities in climatic series. An inhomogeneity can be defined as a change point (a time point in a series such that the observations have a different distribution before and after this time) in the data series induced by changes in measurement conditions at a given station. It is important to be able to detect and correct an inhomogeneity, as it can interfere with the real climate change signal. The first technique is a Bayesian method of multiple change point detection in a multiple linear regression; the second allows the detection of a single change point in a multiple linear regression. These two techniques had never been used for homogenization purposes. The ability of the two techniques to discriminate homogeneous and inhomogeneous series was evaluated using simulated data series. Various sets of synthetic series (homogeneous, with a single shift, and with multiple shifts) representing the typical total annual precipitation observed in the southern and central parts of the province of Quebec, Canada, and nearby areas were generated for the purpose of this study. The two techniques gave small false detection rates on the homogeneous series. Furthermore, the two techniques proved to be efficient for the detection of a single shift in a series. For the series with multiple shifts, the Bayesian method of multiple change point detection performed better. An application to a real data set is also provided and validated with the available metadata.
- Published
- 2009
- Full Text
- View/download PDF
19. Joint Bayesian model selection and parameter estimation of the generalized extreme value model with covariates using birth-death Markov chain Monte Carlo
- Author
-
Taha B. M. J. Ouarda and Salaheddine El Adlouni
- Subjects
Mathematical optimization ,Bayesian probability ,Markov chain Monte Carlo ,Bayesian inference ,Variable-order Bayesian network ,Statistics::Computation ,Bayesian statistics ,symbols.namesake ,Generalized extreme value distribution ,symbols ,Applied mathematics ,Bayesian linear regression ,Water Science and Technology ,Mathematics ,Gibbs sampling - Abstract
This paper describes Bayesian estimation of the parameters of the generalized extreme value (GEV) model with covariates. For this model the parameters of the GEV distribution are functions of covariates, allowing for dependent parameters and/or trends. A Markov chain Monte Carlo (MCMC) algorithm is generally used to estimate the posterior distributions of the parameters in a Bayesian framework. In this paper, the birth-death MCMC (BDMCMC) procedure is developed in order to carry out both parameter estimation and Bayesian model selection. The BDMCMC method allows jumps between models of different dimensions. The general algorithm consists of two types of sampling steps: the first involves dimension-changing moves, and the second is conditional on a fixed model. Parameters are estimated in a fully Bayesian framework, and a model is selected according to the length of time the MCMC chain remains in it. Real and simulated data sets illustrate the usefulness of the proposed methodology.
- Published
- 2009
- Full Text
- View/download PDF
20. Modeling all exceedances above a threshold using an extremal dependence structure: Inferences on several flood characteristics
- Author
-
Eric Sauquet, Jean Michel Grésillon, Taha B. M. J. Ouarda, and Mathieu Ribatet
- Subjects
Structure (mathematical logic) ,Estimation ,Small data ,010504 meteorology & atmospheric sciences ,Series (mathematics) ,Flood myth ,Computer science ,Estimator ,01 natural sciences ,010104 statistics & probability ,13. Climate action ,Econometrics ,0101 mathematics ,Duration (project management) ,0105 earth and related environmental sciences ,Water Science and Technology ,Quantile - Abstract
Flood quantile estimation is of great importance for many engineering studies and policy decisions. However, practitioners must often work with limited data, so the available information must be used optimally. In recent decades, to reduce the waste of data, inferential methodology has evolved from modeling annual maxima to modeling peaks over a threshold. To mitigate the lack of data, peaks over a threshold are sometimes combined with additional information - mostly regional and historical information. However, whatever the extra information is, the most precious information for the practitioner is found at the target site. In this study, a model that allows inferences on the whole time series is introduced. In particular, the proposed model takes into account the dependence between successive extreme observations using an appropriate extremal dependence structure. Results show that this model leads to more accurate flood peak quantile estimates than conventional estimators. In addition, as the time dependence is taken into account, inferences on other flood characteristics can be performed. An illustration is given for flood duration. Our analysis shows that the accuracy of the proposed model in estimating flood duration is related to specific catchment characteristics. Some suggestions for improving flood duration predictions are also introduced.
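A minimal sketch of the peaks-over-threshold viewpoint on a whole series: consecutive exceedances are grouped into events, each yielding a peak and a duration. The series and threshold below are invented for illustration, and no extremal dependence structure is fitted here.

```python
import numpy as np

def exceedance_events(flow, threshold):
    """Group consecutive exceedances of the threshold into flood events,
    returning (peak, duration_in_steps) for each event."""
    above = flow > threshold
    events = []
    start = None
    for i, a in enumerate(above):
        if a and start is None:
            start = i                        # event begins
        elif not a and start is not None:
            seg = flow[start:i]              # event ends
            events.append((seg.max(), i - start))
            start = None
    if start is not None:                    # series ends mid-event
        events.append((flow[start:].max(), len(flow) - start))
    return events
```

Modeling all exceedances within an event (rather than just the peak) is what makes inferences on quantities such as flood duration possible.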
- Published
- 2009
- Full Text
- View/download PDF
21. Depth and homogeneity in regional flood frequency analysis
- Author
-
Fateh Chebana and Taha B. M. J. Ouarda
- Subjects
Multivariate statistics ,Frequency analysis ,Weight function ,Flood frequency analysis ,law ,Homogeneous ,Homogeneity (statistics) ,Statistics ,Regression analysis ,Canonical correlation ,Water Science and Technology ,law.invention ,Mathematics - Abstract
Regional frequency analysis (RFA) generally consists of two steps: (1) delineation of hydrologically homogeneous regions and (2) regional estimation. Existing regionalization methods which adopt this two-step approach suffer from two principal drawbacks. First, the restriction of the regional estimation to a particular region by excluding some sites can correspond to a loss of information. Second, the definition of a region generates a border effect problem. To overcome these problems, a new method is proposed in the present paper. The proposed method is based on three elements: (a) a weight function to treat the border effect problem, (b) a function to evaluate how “similar” each site is to the target one, and (c) an iterative procedure to improve estimation results. Element (b) is treated using the statistical notion of depth functions, which is introduced to provide a ranking of stations in a multivariate context. Furthermore, the properties of depth functions meet the characteristics sought in RFA. It is shown that the proposed method is flexible and general and that traditional RFA methods represent special cases of the depth-based approach corresponding to particular weight functions. A comparison is carried out with the canonical correlation analysis (CCA) approach. Results indicate that the depth-based approach performs better than CCA both in terms of relative bias and relative root-mean-square error.
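Depth functions admit many concrete choices; as an illustration only (not necessarily the depth used in the paper), Mahalanobis depth ranks sites by their centrality relative to a target point in the space of site characteristics:

```python
import numpy as np

def mahalanobis_depth(points, x0=None):
    """Mahalanobis depth D(x) = 1 / (1 + (x - m)' S^-1 (x - m)),
    computed for each row of `points` relative to the target x0
    (default: the sample mean). Values lie in (0, 1]; larger means
    more central, i.e., more similar to the target."""
    m = np.mean(points, axis=0) if x0 is None else np.asarray(x0)
    S = np.cov(points, rowvar=False)
    Sinv = np.linalg.inv(S)
    d = points - m
    md2 = np.einsum("ij,jk,ik->i", d, Sinv, d)
    return 1.0 / (1.0 + md2)
```

In the spirit of the proposed method, such depth values could feed a weight function, so that every station contributes to the regional estimate with a weight that decays with dissimilarity instead of being cut off at a region boundary.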
- Published
- 2008
- Full Text
- View/download PDF
22. Temporal evolution of low-flow regimes in Canadian rivers
- Author
-
Philippe Gachon, Laxmi Sushama, Taha B. M. J. Ouarda, and M. N. Khaliq
- Subjects
geography ,geography.geographical_feature_category ,Trend detection ,Climatology ,Flow (psychology) ,Nonparametric statistics ,Drainage basin ,Environmental science ,Climate change ,Structural basin ,Persistence (discontinuity) ,Field (geography) ,Water Science and Technology - Abstract
This study investigates the temporal evolution of 1-, 7-, 15-, and 30-day annual and seasonal low-flow regimes of pristine river basins included in the Canadian reference hydrometric basin network (RHBN) for three time frames: 1974–2003, 1964–2003, and 1954–2003. For the analysis, the RHBN stations are classified into three categories, corresponding to stations where annual low flows occur in winter only, in summer only, and in both summer and winter seasons. Unlike in previous studies of the RHBN, such a classification is essential to better understand and interpret the identified trends in low-flow regimes. Nonparametric trend detection and bootstrap resampling approaches are used for the assessment of at-site temporal trends under the assumption of no persistence or short-term persistence (STP). The results of the study demonstrate that the previously suggested prewhitening and trend-free prewhitening approaches for incorporating the effect of STP on trend significance are not adequate for reliably identifying trends in low-flow regimes, compared to a simple bootstrap-based approach. The analyses of 10 relatively longer records reveal that trends in low-flow regimes exhibit fluctuating behavior, and hence their temporal and spatial interpretations appear to be sensitive to the time frame chosen for the analysis. Furthermore, under the assumption of long-term persistence (LTP), which is a possible explanation for the fluctuating behavior of trends, many of the significant trends noted under the assumption of STP become nonsignificant and their field significance also disappears. Therefore, correct identification of STP or LTP in time series of low-flow regimes is very important, as it has serious implications for the detection and interpretation of trends.
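A simplified sketch of nonparametric trend detection with resampling-based significance, in the spirit of (but much simpler than) the study's procedure; the permutation null below assumes no persistence:

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences."""
    x = np.asarray(x, dtype=float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def resampling_pvalue(x, n_resamples=2000, seed=0):
    """Two-sided p-value of S under the no-trend null, approximated by
    randomly reordering the series (which destroys any temporal structure)."""
    rng = np.random.default_rng(seed)
    s_obs = abs(mann_kendall_s(x))
    s_null = np.array([abs(mann_kendall_s(rng.permutation(x)))
                       for _ in range(n_resamples)])
    return float(np.mean(s_null >= s_obs))
```

Handling short- or long-term persistence, as the study does, would require block resampling or an explicit persistence model rather than this independent-shuffle null.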
- Published
- 2008
- Full Text
- View/download PDF
23. Intercomparison of homogenization techniques for precipitation data
- Author
-
Claudie Beaulieu, Gilles Boulet, Ousmane Seidou, Xuebin Zhang, Abderrahmane Yagouti, and Taha B. M. J. Ouarda
- Subjects
Climatic data ,Homogeneous ,Homogeneity (statistics) ,Statistics ,Environmental science ,Sequential test ,Bivariate analysis ,Regression ,Water Science and Technology ,Statistical hypothesis testing - Abstract
This paper presents an intercomparison of eight statistical tests to detect inhomogeneities in climatic data. The objective was to select those that are most suitable for precipitation data in the southern and central regions of the province of Quebec, Canada. The performances of these methods were evaluated by simulation on several thousand homogeneous and inhomogeneous synthetic series. These series were generated to reproduce the statistical characteristics of typical precipitation observed in the southern and central parts of the province of Quebec and nearby areas. It was found that none of these methods was efficient for all types of inhomogeneities, but some of them performed substantially better than others: the bivariate test, Jaruskova's method, and the standard normal homogeneity test. Techniques such as the Student sequential test and the two-phase regression led to the worst performances. The analysis of the performances of each method in several situations allowed the design of an optimal procedure that takes advantage of the strengths of the best performing techniques.
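One of the best-performing techniques reported, the standard normal homogeneity test, can be sketched as follows (a minimal single-shift version; in practice critical values for the statistic come from published tables or simulation):

```python
import numpy as np

def snht(x):
    """Standard normal homogeneity test statistic.

    On the standardized series z, T(k) = k * mean(z[:k])^2 + (n - k) * mean(z[k:])^2;
    a large max_k T(k) suggests a mean shift at the argmax position."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    k_hat = int(np.argmax(t)) + 1            # most likely shift position
    return float(t.max()), k_hat
```

The test is applied to a candidate series (often a ratio or difference with neighboring reference series) so that regional climate signals cancel and station-specific shifts stand out.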
- Published
- 2008
- Full Text
- View/download PDF
24. Regional estimation of parameters of a rainfall-runoff model at ungauged watersheds using the 'spatial' structures of the parameters within a canonical physiographic-climatic space
- Author
-
Taha B. M. J. Ouarda, Yeshewatesfa Hundecha, and András Bárdossy
- Subjects
Set (abstract data type) ,Watershed ,Basis (linear algebra) ,Kriging ,Calibration (statistics) ,Statistics ,Extrapolation ,Range (statistics) ,Canonical correlation ,Water Science and Technology ,Mathematics - Abstract
A regionalization scheme by which parameters of a continuous rainfall-runoff model are estimated from physiographic and climatic watershed descriptors is presented. The approach makes use of the spatial structures displayed by the parameters within a physiographic-climatic space defined on the basis of a canonical correlation analysis between model parameters and watershed descriptors. Traditionally, regionalization has been performed using a two-step procedure of first estimating the model parameters in a set of subwatersheds independently and then establishing a relationship between the parameters thus estimated and a set of watershed descriptors. The approach presented in this paper follows a procedure by which the two steps are combined into one. The model is calibrated for the training subwatersheds with a dual objective of maximizing the model performance and achieving well-defined spatial structures of the parameters within the physiographic-climatic space. The model parameters in the subwatersheds that are not used for training are estimated from the optimum parameters obtained in the training set of subwatersheds using ordinary kriging within the physiographic-climatic space. The performance of the model in these subwatersheds is comparable to the performance in the training set obtained using the optimum parameters estimated through model calibration. The results also indicate the possibility of extrapolation of the model parameters under a situation where some of the watershed descriptors lie slightly outside the range within which the training was done.
- Published
- 2008
- Full Text
- View/download PDF
25. Multivariate L-moment homogeneity test
- Author
-
Taha B. M. J. Ouarda and Fateh Chebana
- Subjects
Multivariate statistics ,Multivariate analysis ,Gumbel distribution ,Homogeneity (statistics) ,Statistics ,Econometrics ,Bivariate analysis ,Marginal distribution ,Water Science and Technology ,L-moment ,Copula (probability theory) ,Mathematics - Abstract
Several types of hydrological events are described by multivariate characteristics (droughts, floods, rain storms, etc.). When carrying out a multivariate regional frequency analysis for these events it is important to jointly consider all these characteristics. The aim of this paper is to extend the statistical homogeneity test of Hosking and Wallis (1993) to the multivariate case. As tools, multivariate L-moments are used to define the test statistics, and general copula models are used to describe the statistical behavior of the dependent variables. The usefulness of the methodology is illustrated on flood events. Monte Carlo simulations are also performed for a bivariate Gumbel logistic model with Gumbel marginal distributions. Results illustrate the power of the proposed multivariate L-moment homogeneity test to detect heterogeneity in the whole structure of the model and in the marginal distributions. In a bivariate flood setting, a comparison is carried out with the classical homogeneity test of Hosking and Wallis based on several types of regions.
- Published
- 2007
- Full Text
- View/download PDF
26. Usefulness of the reversible jump Markov chain Monte Carlo model in regional flood frequency analysis
- Author
-
Eric Sauquet, Mathieu Ribatet, Taha B. M. J. Ouarda, and Jean Michel Grésillon
- Subjects
Index (economics) ,010504 meteorology & atmospheric sciences ,Computer science ,Bayesian probability ,Pooling ,0207 environmental engineering ,Estimator ,02 engineering and technology ,Reversible-jump Markov chain Monte Carlo ,01 natural sciences ,Shape parameter ,13. Climate action ,020701 environmental engineering ,Algorithm ,0105 earth and related environmental sciences ,Water Science and Technology - Abstract
Regional flood frequency analysis is a convenient way to reduce estimation uncertainty when few data are available at the gauging site. In this work, a model that assigns a non-null probability to a fixed regional shape parameter is presented. This methodology is integrated within a Bayesian framework and uses reversible jump techniques. The performance of this new estimator on stochastic data is compared to that of two other models: a conventional Bayesian analysis and the index flood approach. Results show that the proposed estimator is well suited to regional estimation when only a few data are available at the target site. Moreover, unlike with the index flood approach, errors in the estimation of the target site index flood seem to have little impact on the Bayesian estimators. Some suggestions about configurations of the pooling groups are also presented to improve the performance of each estimator.
- Published
- 2007
- Full Text
- View/download PDF
27. Bayesian multivariate linear regression with application to change point models in hydrometeorological variables
- Author
-
Jérôme Asselin, Ousmane Seidou, and Taha B. M. J. Ouarda
- Subjects
Multivariate adaptive regression splines ,Proper linear model ,Bayesian multivariate linear regression ,Statistics ,Econometrics ,Linear model ,Regression analysis ,Segmented regression ,Bayesian linear regression ,Regression diagnostic ,Water Science and Technology ,Mathematics - Abstract
Multivariate linear regression is one of the most popular modeling tools in hydrology and climate sciences for explaining the link between key variables. A single regression model is not always appropriate, however, since the relationship may experience sudden changes due to climatic, environmental, or anthropogenic perturbations. To address this issue, a practical and general approach to the Bayesian analysis of the multivariate regression model is presented. The approach allows simultaneous single change point detection in a multivariate sample and can account for missing data in the response variables and/or in the explicative variables. It also improves on recently published change point detection methodologies by allowing a more flexible and thus more realistic prior specification for the existence of a change and the date of change, as well as for the regression parameters. The estimation of all unknown parameters is achieved by Markov chain Monte Carlo simulations. It is shown that the developed approach is able to reproduce the results of Rasmussen (2001) as well as those of Perreault et al. (2000a, 2000b). Furthermore, two of the examples provided in the paper show that the proposed methodology can readily be applied to some problems that cannot be addressed by any of the above-mentioned approaches because of limiting model structure and/or restrictive prior assumptions. The first of these examples deals with single change point detection in the multivariate linear relationship between mean basin-scale precipitation at different periods of the year and the summer–autumn flood peaks of the Broadback River located in northern Quebec, Canada. The second addresses the problem of missing data estimation with uncertainty assessment in multisite streamflow records with a possible simultaneous shift in mean streamflow values that occurred at an unknown date.
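The flavor of single change point detection in regression can be conveyed by a simple profile-likelihood scan, a deliberately non-Bayesian simplification of the approach described above (all data and names below are illustrative):

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of an ordinary least squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def best_changepoint(X, y, min_seg=3):
    """Scan candidate change points; return the split minimizing the total RSS
    of two independent linear regressions, a profile-likelihood analogue of
    locating the posterior mode of a single change point."""
    n = len(y)
    best_total, best_tau = np.inf, None
    for tau in range(min_seg, n - min_seg):
        total = rss(X[:tau], y[:tau]) + rss(X[tau:], y[tau:])
        if total < best_total:
            best_total, best_tau = total, tau
    return best_tau
```

The Bayesian treatment additionally yields a full posterior over the change date and handles missing data, which a point-estimate scan like this cannot.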
- Published
- 2007
- Full Text
- View/download PDF
28. Recursion-based multiple changepoint detection in multiple linear regression and application to river streamflows
- Author
-
Ousmane Seidou and Taha B. M. J. Ouarda
- Subjects
Markov chain ,Computer science ,Multivariable calculus ,Prior probability ,Bayesian probability ,Posterior probability ,Linear regression ,Statistics ,Linear model ,Recursion (computer science) ,Water Science and Technology - Abstract
A large number of models in hydrology and climate sciences rely on multiple linear regression to explain the link between key variables. The relationship in the physical world may experience sudden changes because of climatic, environmental, or anthropogenic perturbations. To deal with this issue, a Bayesian method of multiple changepoint detection in multiple linear regression is proposed in this paper. It is an adaptation of the recursion-based multiple changepoint method of Fearnhead (2005, 2006) to the classical multiple linear model. A new class of priors for the parameters of the multiple linear model is introduced, and useful formulas are derived that permit straightforward computation of the posterior distribution of the changepoints. The proposed method is numerically efficient and does not involve time-consuming Markov chain Monte Carlo simulation, as opposed to other Bayesian changepoint methods. It allows fast and straightforward simulation of the probability of each possible number of changepoints as well as the posterior probability distribution of each changepoint conditional on the number of changes. The approach is validated on simulated data sets and then compared to the methodology of Seidou et al. (2006) on two practical problems: (1) changepoint detection in the multiple linear relationship between mean basin-scale precipitation at different periods of the year and the summer-autumn flood peaks of the Broadback River located in northern Quebec, Canada; and (2) the detection of trend variations in the streamflows of the Ogoki River located in the province of Ontario, Canada.
- Published
- 2007
- Full Text
- View/download PDF
29. Flood frequency analysis at ungauged sites using artificial neural networks in canonical correlation analysis physiographic space
- Author
-
Taha B. M. J. Ouarda and C. Shu
- Subjects
Statistics::Theory ,Artificial neural network ,Flood frequency analysis ,Generalization ,business.industry ,Computer Science::Neural and Evolutionary Computation ,Pattern recognition ,Space (mathematics) ,Kriging ,Statistics ,Artificial intelligence ,Canonical correlation ,business ,Jackknife resampling ,Water Science and Technology ,Mathematics ,Quantile - Abstract
Models based on canonical correlation analysis (CCA) and artificial neural networks (ANNs) are developed to obtain improved flood quantile estimates at ungauged sites. CCA is used to form a canonical physiographic space using the site characteristics from gauged sites. ANN models are then applied to identify the functional relationships between flood quantiles and the physiographic variables in the CCA space. Two ANN models, a single ANN model and an ensemble ANN model, are developed. The proposed approaches are applied to 151 catchments in the province of Quebec, Canada. Two evaluation procedures, the jackknife validation procedure and the split-sample validation procedure, are used to evaluate the performance of the proposed models. Results of the proposed models are compared with those of the original CCA model, the canonical kriging model, and the original ANN models. The results indicate that the CCA-based ANN models provide better estimates than the original ANN models, and that the ANN ensemble approaches provide better generalization ability than the single ANN models. The CCA-based ensemble ANN model shows the best performance among all models in terms of prediction accuracy.
- Published
- 2007
- Full Text
- View/download PDF
30. Generalized maximum likelihood estimators for the nonstationary generalized extreme value model
- Author
-
S. El Adlouni, R. Roy, Xuebin Zhang, Taha B. M. J. Ouarda, and Bernard Bobée
- Subjects
Quadratic equation ,Scale (ratio) ,Location parameter ,Estimation theory ,Statistics ,Covariate ,Generalized extreme value distribution ,Estimator ,Water Science and Technology ,Mathematics ,Quantile - Abstract
The objective of the present study is to develop efficient estimation methods for the use of the GEV distribution for quantile estimation in the presence of nonstationarity. Parameter estimation in the nonstationary GEV model is generally done with the maximum likelihood (ML) estimation method. In this work, we develop the generalized maximum likelihood (GML) estimation method, in which covariates are incorporated into the parameters. A simulation study is carried out to compare the performances of the GML and ML methods in the case of the stationary GEV model (GEV0), the nonstationary case with a linear dependence of the location parameter on covariates (GEV1), the nonstationary case with a quadratic dependence on covariates (GEV2), and the nonstationary case with linear dependence in both the location and scale parameters (GEV11). Simulation results show that the GML method performs better than the ML method in all studied cases. The nonstationary GEV model is also applied to a case study to illustrate its potential. The case study deals with the annual maximum precipitation at the Randsburg station in California, with the Southern Oscillation Index taken as the covariate process.
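A minimal sketch of what the GEV1 model implies for quantiles: with the location parameter linear in a covariate, the p-quantile shifts with the covariate while scale and shape stay fixed. The parameter values below are assumed purely for illustration.

```python
import numpy as np

def gev_quantile(p, mu, sigma, xi):
    """GEV quantile: x_p = mu + (sigma / xi) * ((-ln p)^(-xi) - 1) for xi != 0,
    and mu - sigma * ln(-ln p) in the Gumbel limit xi -> 0."""
    if abs(xi) < 1e-8:
        return mu - sigma * np.log(-np.log(p))
    return mu + sigma / xi * ((-np.log(p)) ** (-xi) - 1.0)

def gev1_quantile(p, covariate, beta0, beta1, sigma, xi):
    """GEV1 model: location mu(t) = beta0 + beta1 * covariate(t),
    with constant scale sigma and shape xi."""
    mu_t = beta0 + beta1 * np.asarray(covariate, dtype=float)
    return gev_quantile(p, mu_t, sigma, xi)
```

A consequence visible in the formula: under GEV1, a unit change in the covariate moves every quantile by exactly beta1, since the covariate enters only through the location.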
- Published
- 2007
- Full Text
- View/download PDF
31. A parametric Bayesian combination of local and regional information in flood frequency analysis
- Author
-
Pierre Bruneau, Taha B. M. J. Ouarda, Bernard Bobée, Ousmane Seidou, and Marc Barbet
- Subjects
Normal distribution ,Bayes' theorem ,Statistics ,Bayesian probability ,Econometrics ,Generalized extreme value distribution ,Estimator ,Log-linear model ,Extreme value theory ,Statistics::Computation ,Water Science and Technology ,Mathematics ,Quantile - Abstract
Because of their impact on hydraulic structure design as well as on floodplain management, flood quantiles must be estimated with the highest precision given available information. If the site of interest has been monitored for a sufficiently long period (more than 30–40 years), at-site frequency analysis can be used to estimate flood quantiles with fair precision. Otherwise, regional estimation may be used to mitigate the lack of data, but local information is then ignored. A commonly used approach to combine at-site and regional information is linear empirical Bayes estimation: under the assumption that both the local and regional flood quantile estimators have a normal distribution, the empirical Bayes estimator of the true quantile is the weighted average of both estimations, with the weighting factor for each estimator inversely proportional to its variance. We propose in this paper an alternative Bayesian method for combining local and regional information which provides the full probability density of quantiles and parameters. The application of the method is made with the generalized extreme value (GEV) distribution, but it can be extended to other types of extreme value distributions. In this method the prior distributions are obtained using a regional log linear regression model, and local observations are then used within a Markov chain Monte Carlo algorithm to infer the posterior distributions of parameters and quantiles. Unlike the empirical Bayes approach, the proposed method works even with a single local observation. It also relaxes the hypothesis of normality of the local quantiles' probability distribution. The performance of the proposed methodology is compared to that of local, regional, and empirical Bayes estimators on three generated regional data sets with different statistical characteristics. The results show that (1) when the regional log linear model is unbiased, the proposed method gives better estimations of the GEV quantiles and parameters than the local, regional, and empirical Bayes estimators; (2) even when the regional log linear model displays a severe relative bias when estimating the quantiles, the proposed method still gives the best estimation of the GEV shape parameter and outperforms the other approaches on higher quantiles, provided the relative bias is the same for all quantiles; and (3) the gain in performance with the new approach is considerable for sites with very short records.
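The linear empirical Bayes combination that the proposed method is compared against can be sketched directly from its description: a variance-weighted average of the local and regional estimates (the numbers in the usage note are illustrative):

```python
def empirical_bayes_combine(q_local, var_local, q_regional, var_regional):
    """Linear empirical Bayes combination of a local and a regional quantile
    estimate: a weighted average with weights inversely proportional to the
    estimator variances (assuming both estimators are normal and unbiased)."""
    w_local = 1.0 / var_local
    w_regional = 1.0 / var_regional
    q = (w_local * q_local + w_regional * q_regional) / (w_local + w_regional)
    var = 1.0 / (w_local + w_regional)       # variance of the combination
    return q, var
```

For example, a precise local estimate (variance 1) pulls the combination strongly toward itself against an imprecise regional one (variance 9), which is exactly why the approach degrades at very short records, where the local variance explodes.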
- Published
- 2006
- Full Text
- View/download PDF
32. Modeling ice growth on Canadian lakes using artificial neural networks
- Author
-
Taha B. M. J. Ouarda, Pierre Bruneau, Ousmane Seidou, Laurent Bilodeau, Massoud Hessami, and André St-Hilaire
- Subjects
Early winter ,Artificial neural network ,Meteorology ,Climatology ,Cloud cover ,Air temperature ,Lake ice ,Environmental science ,Snow ,Physics::Atmospheric and Oceanic Physics ,Physics::Geophysics ,Water Science and Technology ,Ice thickness - Abstract
This paper presents artificial neural network (ANN) models designed to predict ice thickness in Canadian lakes and reservoirs during the early winter ice growth period. The models fit ice thickness measurements at one or more monitored lakes and predict ice thickness during the growth period either at the same locations for dates without measurements (local ANN models) or at any site in the region (regional ANN model), provided that the required meteorological input variables are available. The input variables were selected after preliminary assessments and were derived from time series of daily mean air temperature, rainfall, cloud cover, solar radiation, and average snow depth. The results of the ANN models compared well with those of the deterministic, physics-driven Canadian Lake Ice Model (CLIMO) in terms of root-mean-square error and relative root-mean-square error. The ANN model predictions were also marginally more precise than those of a revised version of Stefan's law (RSL), presented herein, and they reproduced some intrawinter and interannual growth rate fluctuations that were not accounted for by the RSL. The performance of the models results in good part from a careful choice of input variables, inspired by work on deterministic models such as CLIMO. ANN models of ice thickness show good potential for use in contexts where ad hoc adjustments are desirable because of the limited availability of measurements, and where data nature, availability, and quality preclude the use of deterministic physics-driven models.
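The RSL baseline follows the classical Stefan relation, in which thickness grows with the square root of accumulated freezing degree-days. A minimal sketch with an assumed growth coefficient (the paper's revised version is not reproduced here):

```python
import numpy as np

def stefan_ice_thickness(daily_mean_temp_c, alpha=2.0, freeze_point=0.0):
    """Classical Stefan-type ice growth: h = alpha * sqrt(FDD), where FDD is
    the running sum of freezing degree-days (degrees C below the freeze point).
    The coefficient alpha (cm per sqrt(degC * day)) is an assumed, site-tuned
    value, not one taken from the paper."""
    t = np.asarray(daily_mean_temp_c, dtype=float)
    fdd = np.cumsum(np.maximum(freeze_point - t, 0.0))
    return alpha * np.sqrt(fdd)
```

Because the relation depends only on cumulative cold, it cannot reproduce the intrawinter growth-rate fluctuations driven by snow cover, cloud cover, or radiation, which is precisely where the ANN models gain their edge.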
- Published
- 2006
- Full Text
- View/download PDF
33. Physiographical space-based kriging for regional flood frequency estimation at ungauged sites
- Author
-
Karem Chokmani and Taha B. M. J. Ouarda
- Subjects
Return period ,Coefficient of determination ,Flood myth ,Kriging ,Principal component analysis ,Statistics ,Canonical correlation ,Cross-validation ,Water Science and Technology ,Quantile ,Mathematics - Abstract
A physiographical space-based kriging method is proposed for regional flood frequency estimation. The methodology relies on the construction of a continuous physiographical space using physiographical and meteorological characteristics of gauging stations, together with multivariate analysis techniques. Two multivariate analysis methods were tested: canonical correlation analysis (CCA) and principal components analysis. Ordinary kriging, a geostatistical technique, was then used to interpolate flow quantiles through the physiographical space. Data from 151 gauging stations across the southern part of the province of Quebec, Canada, were used to illustrate this approach. In order to evaluate the performance of the proposed method, two validation techniques, cross validation and split-sample validation, were applied to estimate flood quantiles corresponding to the 10, 50, and 100 year return periods. Results of the proposed method were compared to those produced by a traditional regional estimation method using canonical correlation analysis. The proposed method yielded satisfactory results. It allowed, for instance, estimation of the 10 year return period specific flow with a coefficient of determination of up to 0.78; this performance decreases, however, as the quantile return period increases. Results also showed that the proposed method works better when the physiographical space is defined using canonical correlation analysis. It is shown that kriging in the CCA physiographical space yields results as precise as the traditional estimation method, with a fraction of the effort and computation time.
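Ordinary kriging through a two-dimensional physiographical space can be sketched as follows; the exponential covariance model and its parameters are assumptions for illustration, not the paper's fitted variogram:

```python
import numpy as np

def ordinary_kriging(coords, values, targets, range_=1.0, sill=1.0):
    """Ordinary kriging with an exponential covariance model
    C(h) = sill * exp(-h / range_). Solves the standard kriging system with
    a Lagrange multiplier enforcing weights that sum to one, so the estimate
    is an unbiased weighted average of the observed quantiles."""
    coords, targets = np.asarray(coords, float), np.asarray(targets, float)
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = sill * np.exp(-d / range_)   # data-to-data covariances
    A[n, n] = 0.0                            # Lagrange multiplier row/column
    out = []
    for t in targets:
        c0 = sill * np.exp(-np.linalg.norm(coords - t, axis=1) / range_)
        w = np.linalg.solve(A, np.append(c0, 1.0))[:n]
        out.append(float(w @ values))
    return np.array(out)
```

An ungauged site is simply projected into the same physiographical space and treated as a target point; kriging is exact at the gauged stations themselves.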
- Published
- 2004
- Full Text
- View/download PDF
34. On the objective identification of flood seasons
- Author
- Bernard Bobée, Taha B. M. J. Ouarda, and Juraj M. Cunderlik
- Subjects
Hydrology ,Water resources ,Identification (information) ,Hydrology (agriculture) ,Flood myth ,Flood forecasting ,Environmental science ,Sampling (statistics) ,Seasonality ,Scale (map) ,Water Science and Technology - Abstract
The determination of seasons of high and low probability of flood occurrence is a task with many practical applications in contemporary hydrology and water resources management. Flood seasons are generally identified subjectively by visually assessing the temporal distribution of flood occurrences, and then verified at the regional scale by comparing that distribution with the distributions obtained at hydrologically similar neighboring sites. This approach is subjective, time consuming, and potentially unreliable. The main objective of this study is therefore to introduce a new, objective, and systematic method for the identification of flood seasons. The proposed method tests the significance of flood seasons by comparing the observed variability of flood occurrences with the theoretical flood variability in a nonseasonal model. The method also addresses the uncertainty resulting from sampling variability by quantifying the probability associated with the identified flood seasons. The performance of the method was tested on an extensive number of samples with different record lengths generated from several theoretical models of flood seasonality. The proposed approach was then applied to real data from a large set of sites with different flood regimes across Great Britain. The results show that the method can efficiently identify flood seasons from both theoretical and observed distributions of flood occurrence. The results were used for the determination of the main flood seasonality types in Great Britain.
- Published
- 2004
- Full Text
- View/download PDF
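The testing idea described in the abstract above, comparing the observed variability of flood occurrence dates against a nonseasonal model while accounting for sampling variability at the given record length, can be sketched with directional statistics. The resultant-length statistic and the uniform-dates null below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def resultant_length(days, period=365.25):
    # Map flood occurrence days onto the annual circle and compute the
    # mean resultant length: 0 = no seasonal concentration, 1 = all
    # floods on the same calendar day.
    theta = 2 * np.pi * np.asarray(days) / period
    return np.hypot(np.cos(theta).mean(), np.sin(theta).mean())

def seasonality_pvalue(days, n_sim=5000, period=365.25):
    # Monte Carlo test: compare the observed concentration of flood
    # dates against a nonseasonal model (dates uniform over the year),
    # simulated at the same record length to capture sampling variability.
    r_obs = resultant_length(days, period)
    n = len(days)
    r_null = np.array([resultant_length(rng.uniform(0, period, n), period)
                       for _ in range(n_sim)])
    return (r_null >= r_obs).mean()

# A strongly seasonal record (e.g. spring snowmelt floods clustered
# around day 120) versus a record with no seasonality.
seasonal = rng.normal(120, 15, size=40) % 365.25
uniform_rec = rng.uniform(0, 365.25, size=40)
```

With the seasonal record the test returns a p-value near zero, flagging a significant flood season; with the uniform record the nonseasonal model cannot be rejected.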
35. Comment on 'The use of artificial neural networks for the prediction of water quality parameters' by H. R. Maier and G. C. Dandy
- Author
- Taha B. M. J. Ouarda, Vincent Fortin, and Bernard Bobée
- Subjects
Artificial neural network ,Water quality ,Artificial intelligence ,Water Science and Technology ,Mathematics
- Published
- 1997
- Full Text
- View/download PDF