Author Topic: Coupled damage - Loss models in PELICUN  (Read 8382 times)


  • Newbie
  • Posts: 1
Coupled damage - Loss models in PELICUN
« on: May 27, 2022, 06:28:03 AM »
I need some clarification about how the damage and loss models are coupled in PELICUN to get a coupled damage and loss model. I assume some ratios are worked out which, when multiplied and summed over the fragility curves, give the expected loss curve. As such, these ratios should be the same for similar building attributes. Can you please provide some explanation or clarification about this?
I have been reading the report on the Lake Charles testbed on DesignSafe, where the same approach has been used.


  • Moderator
  • Jr. Member
  • Posts: 84
Re: Coupled damage - Loss models in PELICUN
« Reply #1 on: May 28, 2022, 02:15:55 AM »
Hi Asim,

Thank you for your interest in Pelicun and for asking this question on the Forum.

Let me give you a high-level overview first and I am happy to answer any questions on the details later.

As you probably know, Hazus uses multilinear functions for wind damage and loss assessment. Both function types are defined by a series of values that correspond to damage or loss conditioned on wind speeds that increase in 5 mph increments from 50 mph to 250 mph.
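As an illustration of that data format (the control-point values below are made up, not actual Hazus numbers), a multilinear function like this can be evaluated on the 5 mph wind speed grid by linear interpolation:

```python
import numpy as np

# Wind speed grid used by the Hazus wind functions: 50 to 250 mph in 5 mph steps
wind_speeds = np.arange(50, 251, 5)  # 41 points

# Hypothetical loss ratios at a few control speeds (illustrative values only)
control_speeds = np.array([50, 100, 150, 200, 250])
control_losses = np.array([0.0, 0.05, 0.35, 0.80, 1.00])

# Evaluate the multilinear loss function at every 5 mph increment
loss_curve = np.interp(wind_speeds, control_speeds, control_losses)
```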

We extracted the above data from Hazus and parsed it for 5,076 building archetypes × 5 different terrain roughnesses.

For each archetype-terrain combination:
    - For each damage state:
        - We ran two optimizations to find the best fitting normal and lognormal CDF to the data. The objective function for the fitting was the sum of squared errors at the discrete points (every 5 mph) from 50 mph to 200 mph. Note that we did not consider wind speeds above 200 mph because we were concerned about the robustness of the data in that domain.
        - The distribution family with the smaller error was chosen from these two results.
        - The error magnitude was saved and later reviewed for the entire database. For the vast majority of the data, the fit is almost perfect. I can provide quantitative details if you are interested.
    - With the fragility functions available as CDFs, we calculated the probability of being in each damage state at each 5 mph increment from 50 to 200 mph.
    - We then ran an optimization where the unknowns were the loss ratios assigned to the first three damage states. The fourth damage state was always assigned a loss ratio of 1.0 (i.e., total loss). The loss at each wind speed is the expected loss, that is, the sum over damage states of the product of each damage state's probability and the corresponding loss ratio.
    - This optimization was a bit trickier because we had to add constraints to ensure the loss ratios are monotonically increasing. The objective function was the sum of squared errors between the Hazus losses and our model's losses at each 5 mph increment from 50 mph to 200 mph.
    - The fit was great in most cases, but we found some archetypes where the fragility curves and the loss curves disagreed so strongly that coupling them with the above method was only possible with considerable error. We believe the curves we produced for these cases represent more realistic behavior and consequences than the ones in the Hazus database. Again, I am more than happy to elaborate if you are interested.
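The steps above can be sketched in Python with SciPy. Note this is an illustrative sketch, not the actual Pelicun implementation: the function names, initial guesses, and optimizer choices (Nelder-Mead for the CDF fits, SLSQP for the constrained loss-ratio fit) are my assumptions.

```python
import numpy as np
from scipy.stats import norm, lognorm
from scipy.optimize import minimize

speeds = np.arange(50, 201, 5)  # fitting domain: 50-200 mph in 5 mph steps


def fit_cdf(speeds, probs):
    """Fit normal and lognormal CDFs to discrete fragility data by least
    squares and keep whichever family has the smaller error."""

    def sse_normal(params):
        mu, sigma = params
        return np.sum((norm.cdf(speeds, mu, sigma) - probs) ** 2)

    def sse_lognormal(params):
        median, beta = params
        return np.sum((lognorm.cdf(speeds, beta, scale=median) - probs) ** 2)

    # Initial guesses are arbitrary but reasonable for wind fragilities
    res_n = minimize(sse_normal, x0=[120.0, 30.0], method="Nelder-Mead")
    res_ln = minimize(sse_lognormal, x0=[120.0, 0.3], method="Nelder-Mead")
    if res_n.fun <= res_ln.fun:
        return "normal", res_n.x, res_n.fun
    return "lognormal", res_ln.x, res_ln.fun


def fit_loss_ratios(ds_exceed_probs, hazus_losses):
    """Find loss ratios r1 <= r2 <= r3 (DS4 fixed at 1.0) that minimize the
    sum of squared errors to the Hazus loss curve.

    ds_exceed_probs: (4, n) array of exceedance probabilities for DS1..DS4
    hazus_losses:    (n,) target loss ratios at each wind speed
    """
    n = ds_exceed_probs.shape[1]
    # Probability of being exactly in each damage state:
    # P(DS_i) = P(>= DS_i) - P(>= DS_{i+1}), with P(>= DS_5) = 0
    p = ds_exceed_probs - np.vstack([ds_exceed_probs[1:], np.zeros((1, n))])

    def sse(r):
        ratios = np.append(r, 1.0)  # DS4 is always total loss
        model = ratios @ p          # expected loss at each wind speed
        return np.sum((model - hazus_losses) ** 2)

    # Monotonicity constraints: r1 <= r2 <= r3 <= 1
    cons = [{"type": "ineq", "fun": lambda r: r[1] - r[0]},
            {"type": "ineq", "fun": lambda r: r[2] - r[1]},
            {"type": "ineq", "fun": lambda r: 1.0 - r[2]}]
    res = minimize(sse, x0=[0.05, 0.25, 0.6], bounds=[(0.0, 1.0)] * 3,
                   constraints=cons, method="SLSQP")
    return res.x
```

With four fitted fragility CDFs, stacking their values at the 5 mph grid into `ds_exceed_probs` and calling `fit_loss_ratios` recovers one loss ratio per damage state, exactly the quantities that couple the damage and loss models.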

The fragility and loss data is available in the developer branch of Pelicun:
    - Fragilities:
    - Losses:

I plan to compile a similar database with the raw Hazus data to facilitate benchmark studies that compare the two approaches.

Let me know if you have further questions.