Messages - adamzs

46
Hi Rezvan,

Thank you for reaching out to us; it is good to hear that you find Pelicun helpful for your work.

Based on the error message, I assume the problem with the shortened file in your first question is that the names of certain attributes under the GeneralInformation part of the file do not follow the standard naming convention we introduced across our tools last summer. I suspect the workshop you refer to was held before that date. Since those standard names were introduced, pelicun 2.6 and later versions look for NumberOfStories and PlanArea under GeneralInformation and do not accept other names for these attributes anymore. Please take a look at the shortened file and edit it if needed so that it follows these conventions. That should help you get past the error.

Several approaches are available to analyze a large number of buildings:
- If this is a regional analysis, i.e., the buildings are in a geographical context with a location assigned to each, then I suggest using our R2D Tool or the rWHALE backend to run the analysis.
- If this is more of a parametric study on a large set of archetypes, then you can do one of the following:
  = As you mentioned, you could prepare a dakota.json file for each building. I know grad students who do this through MATLAB (without using PBE at all) by printing out a text file and then running Pelicun as an application directly from MATLAB. Nevertheless, I agree with you that this approach is far from efficient.
  = A better way to handle this would be to import pelicun in a Python script and use it as a library rather than as an application. PBE uses the DL_calculation.py script to run pelicun as an application. If you take a look at that script (it's under tools in the pelicun package), you'll see how pelicun is imported and how the various methods in the library are called. Calling these directly will make your code more efficient, but you'd still need to prepare input files and read output files if you are using pelicun 2.6. Also note that the dakota.json input file is just a dictionary once it is loaded into Python. So, you can prepare a dictionary and save it to a json file using the json package in Python (see the sketch after this list).
  = One of the major changes in pelicun 3 is a redesign of how researchers can interact with the library, precisely to support the use case that you have. You do not need to prepare an input file, and you can get the outputs directly as Python objects, so you stay within Python for the entire analysis and large calculations become much more efficient. I will present these features next Friday (Feb 18) during our Live Expert Tips session. If you are interested, I encourage you to register and participate in the event. Here is a link: https://designsafe-ci.zoom.us/meeting/register/tJYpdOGuqTgrHt3FR0yM7dxmCYf6kiEx5Btm
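
To illustrate the second option, here is a minimal sketch of preparing the input as a dictionary and saving it to a json file. The attribute values and the overall structure are placeholders; only the GeneralInformation names follow the conventions mentioned above.

    import json

    # Minimal sketch: build the input as a Python dictionary and save it
    # as a json file for pelicun 2.6. All values are placeholders.
    dl_input = {
        'GeneralInformation': {
            'NumberOfStories': 4,   # standardized attribute name
            'PlanArea': 10000.0,    # standardized attribute name
        },
        # ... the rest of the dakota.json input would go here ...
    }

    with open('dakota.json', 'w') as f:
        json.dump(dl_input, f, indent=2)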

Let me know if you have further questions.

Adam

P.S. I hope you don't mind if we delete the copy of your question that was posted on the PBE board.

47
Regional Hazard Simulation (R2D, rWhale) / Re: openquake selection widget
« on: February 08, 2022, 08:42:35 PM »
Thank you, Anne; we are happy to help and the entire team really appreciates your positive feedback!

48
Hi Mariana,

The algorithm we use in the background to assign a Seismic Design Level based on the Year of Construction assumes that the region of interest is in UBC Zone 4 in California. We use Table 5-35 from the Hazus Technical Manual (see a copy attached below) as the basis of the assignment. It would be relatively straightforward for us to extend our script to support all zones listed in Table 5-35 and allow you to select the zone that applies to your analysis. However, the break points in that table correspond to major code benchmark years in California, and I noticed that your site is in Texas, near El Paso. To get realistic results, you would need an assignment logic that corresponds to the design eras in that area. These eras might be building-specific, just as we have special rules for W1 buildings in California. You can prepare such logic in a Python script and feed it to the workflow in R2D; we use such scripts in examples 7 and 8 to analyze hurricane impacts, see here in the DL panel: https://nheri-simcenter.github.io/R2D-Documentation/common/user_manual/examples/desktop/E7HurricaneWindWater/README.html

I see a few different options going forward and we are happy to support you regardless of which one you choose:

- You can manually modify the year of construction for your URM buildings to be less than 1975. This will prevent them from being assigned to High-Code and let the analysis run. I suggest giving this a try just to confirm that you can run a regional analysis in R2D. Note that the results you get with this approach are not at all realistic; I do not recommend using them for anything but testing purposes.

- We can extend our algorithm to support all Zones in Table 5-35 and you can pick the Zone that you think represents the conditions in your target area. Depending on the evolution of design codes in your target area, this might work. Please let us know if you find this feature helpful for your work so that we can add it to the list of requests we have. I anticipate this extension will fit into the next release of R2D, which we are planning for late March. If you need it earlier than that, we can probably provide you with a developer version of the tool that has this feature in a few weeks.

- Finally, you can prepare your own algorithm to assign Hazus fragility curves to buildings based on the local design evolution in El Paso. If you plan to use the results for serious research, I recommend going down this route. Let us know if you are interested and we are happy to provide you guidance on how to prepare such a script (a rough sketch follows below).
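
To illustrate the third option, below is a hypothetical sketch of such an assignment logic. The benchmark years are made up for illustration; the real break points must come from the local code adoption history, and the interface the R2D workflow expects from such a script is shown in the example scripts linked above.

    def assign_design_level(year_built):
        # Hypothetical benchmark years, for illustration only; replace
        # them with the design eras that apply in the El Paso area.
        # Labels follow the Hazus convention of Pre-, Low-, Moderate-,
        # and High-Code design levels.
        if year_built < 1941:
            return 'Pre-Code'
        elif year_built < 1975:
            return 'Low-Code'
        elif year_built < 2000:
            return 'Moderate-Code'
        return 'High-Code'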

I hope this helps.

Adam


49
Hi Scott,

Thank you for reaching out to us; I will help you figure out what is going wrong in the simulation.

The error you received could stem from a problem in the OpenSees simulation or in the damage and loss assessment. First, I suggest excluding the possibility of an OpenSees issue.

Please navigate to your working directory (C:/Users/KPP/Documents/PBE/LocalWorkDir), then go to the tmp.SimCenter folder; you should see a log.txt file there. Sharing that file here would help me a lot in understanding the issue.

Thank you,
Adam

50
Thank you for the additional information.

As far as I can tell, you are interested in structural response estimation. The extensions you propose to the StandardEarthquakeEDP and MDOF tools are great, and they would be appreciated by other members of the research community.

I have a few suggestions:

- As long as you focus on EDPs, it would probably be easier to use the EE-UQ tool instead of PBE. PBE adds an extra step to the workflow that you don't seem to use now. Once you have the extension with the new EDPs and base isolator added, you will be able to return to PBE and take advantage of those changes.

- Our office hours have been superseded by Live Expert Tips sessions that we hold every Friday. They include a presentation (the tips) and an open discussion around that topic afterward. Our next EE-UQ presentation is a few weeks out, so I suggest continuing the discussion here and perhaps setting up a Zoom call if needed.

First, I suggest you look at EE-UQ, try to run the workflow there, and let me know how that goes. If the individual modules work, EE-UQ should also run well.

Adam

51
Hi Ahmed,

Thank you for reaching out to us.

Please tell me more about the extension you are working on.

I see that you have modified the response estimation parts of the workflow, but you will also need to edit pelicun (in the performDL folder) to make sure your new EDP is properly considered in the damage and loss assessment, unless you are adding an EDP that pelicun already handles (you can see the list of EDPs pelicun currently supports here: https://nheri-simcenter.github.io/pelicun/common/user_manual/usage/pelicun/res_data.html).

Can you share more information on how you tested each modification individually, and what example you are trying to run when you assemble the entire workflow?

Thank you,
Adam

52
Hi Emily,

Ground failure currently handles damage assessment due to liquefaction. This type of assessment is supported using damage models from the Hazus earthquake methodology. These models use horizontal and vertical permanent ground displacement as input and provide a probability of extensive or complete damage as output. The calculation of PGD values is not yet integrated into our workflow; we expect researchers to provide those intensity measures as input.
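
For illustration, the sketch below evaluates a generic lognormal damage model of this kind; the median and dispersion values are placeholders, not the actual Hazus ground failure parameters.

    import numpy as np
    from scipy.stats import norm

    def p_extensive_or_complete(pgd, median=10.0, beta=1.2):
        # Probability of reaching extensive or complete damage given a
        # permanent ground displacement demand; the median (in the same
        # units as pgd) and dispersion (beta) are placeholder values.
        return norm.cdf(np.log(pgd / median) / beta)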

Example 5 in R2D is based on the work of Tamika Bassman, Jaewon Saw, and Sijin Wang. If you are interested in how they calculated PGD, I am happy to introduce you to them.

Adam

53
Damage & Loss (PELICUN) / Re: EDP keys
« on: October 11, 2021, 08:46:59 PM »
Hi Ioannis,

Did you manage to get PFV to work?

If not, I would appreciate it if you could share more information about the environment you use so that we can find the reason behind the error you experience and fix it.

Thank you

55
I am relaying a feature request from a researcher at Stanford:

They are using R2D to run IM-based risk assessment and they would appreciate it if they could generate more than 99 fields. Currently, they cannot enter a larger number under "Number of ground motions per site" in the HAZ tab. Would it be possible to remove that constraint?

56
Thanks for the detailed response! It is reassuring to know that the docs are helpful and that the error was only due to this detail.

Let us know if you run into any other problems.

57
Damage & Loss (PELICUN) / Re: EDP keys
« on: September 28, 2021, 05:13:42 AM »
Hi Ioannis,

PFV should work and your label seems correct. It is not even a new feature, so I am surprised to see a bug there.

Which version of pelicun do you use? (You can check it in the __init__.py file, or by importing pelicun and inspecting pelicun.__version__ in Python.)
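
For example:

    import pelicun
    print(pelicun.__version__)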

58
Hi Ioannis,

I am happy to hear that your latest builds performed as expected, and it is also good to know that PBE builds fine on Ubuntu. Although the backend is platform-independent, we only test the frontends on Mac and Windows.

If you don't mind, I would like to learn more about the error you experienced when switching from HAZUS to FEMA P58. Which paths in the preferences were involved in this?
I assume it is the Backend Applications path. A valid Backend Applications path is important because the default fragility data is stored as part of the backend. When you choose FEMA P58, that fragility data is loaded to populate the component list in the Components tab under DL. If the path to the backend applications is invalid, I can imagine the tool failing when it tries to load those defaults.
I would appreciate it if you could confirm that this was indeed the case in your experience or provide more information to help me better understand the error.

Thank you!

59
Hi Ioannis,

Thank you for reporting this strange behavior and sharing your input files. I tried to reproduce the problem and I got correct results for both cases.

Are you using the latest version of PBE?
You can find the latest version here (the documentation also points to this location): https://www.designsafe-ci.org/data/browser/public/designsafe.storage.community/%2FSimCenter%2FSoftware%2FPBE

If you are using the latest version, I would appreciate it if you could share the results generated by the application in the first (same_file) case. The results are available in the working directory; the location of that directory is specified under File/Preferences/Local Jobs Directory.

Thank you!

60
Hi Derek,

Thank you for that suggestion; the debris estimation method from HAZUS-MH can be included in Pelicun to provide the quantity of generated debris as another decision variable. I have added it to the list of requested features and will incorporate it in the next major release; expect it to be available in the second half of this year.
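
Conceptually, such a debris estimate is a probability-weighted sum over damage states. A minimal sketch, assuming hypothetical unit debris weights per damage state:

    def expected_debris_weight(ds_probabilities, unit_weights, floor_area):
        # Expected debris weight as a probability-weighted sum over damage
        # states; unit_weights are hypothetical debris weights per unit
        # floor area for each damage state, not actual HAZUS-MH values.
        return floor_area * sum(
            p * w for p, w in zip(ds_probabilities, unit_weights))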
