New Use Case: TC Verification Compare ADECK vs BDECK #911


Description

@JohnHalleyGotway

Describe the New Use Case

The National Hurricane Center organizes its track data into separate files for each storm. A historical archive of this data is available via the NHC FTP site:
https://ftp.nhc.noaa.gov/atcf/archive/
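For reference, here is a minimal sketch of pulling one storm's deck files from that archive. It assumes the per-storm files sit under a year subdirectory with names like aal072019.dat.gz; the exact layout and gzip compression are assumptions to verify against the archive itself.

```python
# Sketch only: fetch one storm's ADECK/BDECK/EDECK files from the NHC archive.
# Assumes per-storm files are stored as <deck><basin><num><year>.dat.gz under
# a year subdirectory; verify the actual layout before relying on this.
import gzip
import shutil
import urllib.request

ARCHIVE = "https://ftp.nhc.noaa.gov/atcf/archive"

def fetch_deck(deck, basin, storm_num, year, out_dir="."):
    """Download and un-gzip a single deck file, e.g. aal072019.dat."""
    name = f"{deck}{basin.lower()}{storm_num:02d}{year}.dat"
    urllib.request.urlretrieve(f"{ARCHIVE}/{year}/{name}.gz", f"{out_dir}/{name}.gz")
    with gzip.open(f"{out_dir}/{name}.gz", "rb") as src, open(f"{out_dir}/{name}", "wb") as dst:
        shutil.copyfileobj(src, dst)

for deck in ("a", "b", "e"):   # ADECK, BDECK, EDECK
    fetch_deck(deck, "AL", 7, 2019)
```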

For past hurricane evaluations, the DTC ran the MET tc_pairs tool once for each input storm. METplus team members met with Yan Jin from the NOAA/HAFS group on 5/4/21. She is also processing NHC data organized this way. Recommend that we add a new use case to demonstrate this type of processing logic.

The user defines a 4-digit year (e.g. 2019) and the basin(s) they'd like to process (AL and/or EP). We could also have them explicitly specify a list of storm numbers to process. For example, there were 20 storms in the AL basin in 2019, numbered 01 to 20. This use case should loop through the listed storm numbers. For each, identify the ADECK track (e.g. aal072019.dat for storm number 07), the BDECK track (e.g. bal072019.dat), and the EDECK track (e.g. eal072019.dat), then call tc_pairs with these single files as input.
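As a minimal sketch of that loop (not the eventual METplus wrapper configuration), assuming the deck files are already on disk and a TCPairsConfig file exists; the data directory, config path, and storm-number lists below are placeholders:

```python
# Sketch only: run MET tc_pairs once per storm, passing single ADECK/BDECK/
# EDECK files as input. DATA_DIR, CONFIG, and the storm lists are placeholders.
import subprocess

DATA_DIR = "/path/to/decks"
CONFIG = "/path/to/TCPairsConfig"      # model filtering is set in this file
YEAR = 2019
STORMS = {"AL": [5, 7, 13], "EP": [1, 6, 10]}   # example subsets, 3 per basin

for basin, numbers in STORMS.items():
    for num in numbers:
        storm = f"{basin.lower()}{num:02d}{YEAR}"
        subprocess.run(
            [
                "tc_pairs",
                "-adeck", f"{DATA_DIR}/a{storm}.dat",
                "-bdeck", f"{DATA_DIR}/b{storm}.dat",
                "-edeck", f"{DATA_DIR}/e{storm}.dat",
                "-config", CONFIG,
                "-out", f"tc_pairs_{storm}",
            ],
            check=True,
        )
```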

We probably do not need to process all 20 AL storms for this use case. Instead, consider processing 3 AL storms and 3 EP storms. We also probably don't need to process every available forecast track; we could limit the models to something like OFCL, HWRF, HMON, and GFSO.
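The model restriction would live in the TC-Pairs configuration passed via -config rather than on the command line; MET's default TCPairsConfig contains a model list that filters ATCF tech IDs. A small sketch of building that entry (splicing it into a copy of the default config is left to the use case):

```python
# Sketch only: format the "model" entry for a TCPairsConfig copied from MET's
# default config; pass the edited file to tc_pairs via -config as shown above.
MODELS = ["OFCL", "HWRF", "HMON", "GFSO"]

model_line = "model = [ " + ", ".join(f'"{m}"' for m in MODELS) + " ];"
print(model_line)   # model = [ "OFCL", "HWRF", "HMON", "GFSO" ];
```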

Ideally, after processing all the storms in the list, we would then call METplotpy to plot the track and intensity errors for those storms. However, that functionality may not yet exist. We could also consider calling METdbLoad to load the .tcst output files into a database for use with METviewer.

Acceptance Testing

Recommend having DTC hurricane scientists (@KathrynNewman, @mrinalbiswas, or @evankalina) evaluate this use case to confirm that it works as expected.

Recommend asking Yan Jin to also test this out.
List input data types and sources.
Describe tests required for new functionality.

Time Estimate

Estimate the amount of work required here.
Issues should represent approximately 1 to 3 days of work.

Sub-Issues

Consider breaking the new feature down into sub-issues.

  • Add a checkbox for each sub-issue here.

Relevant Deadlines

List relevant project deadlines here or state NONE.

Funding Source

Define the source of funding and account keys here or state NONE.

Define the Metadata

Assignee

  • Select engineer(s) or no engineer required
  • Select scientist(s) or no scientist required

Labels

  • Select component(s)
  • Select priority
  • Select requestor(s)
  • Select privacy

Projects and Milestone

  • Review projects and select relevant Repository and Organization ones or add "alert:NEED PROJECT ASSIGNMENT" label
  • Set the milestone to the next major version milestone or "Future Versions"

Define Related Issue(s)

Consider the impact to the other METplus components.

New Use Case Checklist

See the METplus Workflow for details.

  • Complete the issue definition above, including the Time Estimate and Funding source.
  • Fork this repository or create a branch of develop.
    Branch name: feature_<Issue Number>_<Description>
  • Complete the development and test your changes.
  • Add/update log messages for easier debugging.
  • Add/update unit tests.
  • Add/update documentation.
  • Push local changes to GitHub.
  • Submit a pull request to merge into develop.
    Pull request: feature <Issue Number> <Description>
  • Define the pull request metadata, as permissions allow.
    Select: Reviewer(s), Project(s), Milestone, and Linked issues
  • Iterate until the reviewer(s) accept and merge your changes.
  • Delete your fork or branch.
  • Close this issue.
