About severe.im
Hi there.
Welcome to severe.im. Glad you found me. This is my crude weather page, where I plot satellite, radar, model, and surface data. I use many of these images in my daily teaching, but I'm happy to make them available publicly for anyone to access and share.
If you ever spot a problem, or want to reach out, there's always a contact form for that.
Below are some details about each section of the main page -- what data is used, where it comes from, and so on. If you're looking at the main page from a phone, you might not realize that there are two columns (which is why I refer to left and right below).
Satellite Imagery (Left Column)
Both sections of satellite imagery use data from NOAA's GOES-19 satellite. Each page is a JavaScript loop that uses the HTML5 AnimationS app (HAniS) from the University of Wisconsin.
Satellite channels used
For both sets of images, the channels used are:
- Visible: channel 2 (0.64 microns), the "red band"
- Infrared: channel 13 (10.3 microns), the "clean longwave window band"
- Water vapor: channel 8 (6.2 microns), the "upper-level tropospheric water vapor band"
Main set of images
The main cluster of imagery uses data from a Unidata data feed. The files arrive in AREA format and are plotted in Gempak version 7.
"Legacy" set of images
The legacy images are downloaded directly from the NESDIS Center for Satellite Applications and Research (STAR) and cropped to produce each sector.
Surface Station Plots (Left Column)
Surface data is received from the Unidata data feed, and plotted for various sectors. I produce three types of images: a "simple" plot that has no pressure data; an "advanced" plot that does include pressure data; and a "busy" plot that has a smaller font. In my first-year courses, I tend to skip the decoding (and contouring) of pressure, so the simple plots are quite accessible.
The first hourly run is at about :03 after the hour, which is sometimes too early for the feed to have populated with enough data, so the maps can look very bare. I re-run at about :20 after the hour, and by then there is usually enough data that the map looks "normal" and full of stations.
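A schedule like that maps directly onto two crontab entries. The script path below is hypothetical; the minute fields are the part that matters:

```shell
# Hypothetical crontab entries for the hourly surface plots.
# First pass shortly after the hour; the feed may still be sparse.
3 * * * * /home/wx/scripts/surface_plots.sh
# Second pass once the Unidata feed has (usually) filled in.
20 * * * * /home/wx/scripts/surface_plots.sh
```

Running the same script twice keeps the logic simple: the second run just overwrites the bare first-pass maps.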
Only the current hour is plotted. There are no archived images saved.
Radar Imagery (Right Column)
Each of the two radar sections uses a distinct source.
"Regional Radar Filtered"
This section uses the MRMS Composite Reflectivity data produced by NOAA/NSSL. It is quality controlled, with most of the non-meteorological returns removed. Most of the images are created in Gempak with a black background, but I do produce two images in GrADS with a white background. The data are available thanks to NOAA's agreement with the AWS (Amazon) Data Exchange.
Instead of plotting the original units of dBZ, on these images I've used a written scale that roughly corresponds to my interpretation of each dBZ value. The two GrADS images also have an hourly rainfall estimate scale, based on the standard Marshall-Palmer Z-R relationship.
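For reference, the Marshall-Palmer relationship is Z = 200 R^1.6 (Z in mm^6/m^3, R in mm/hr), and Z = 10^(dBZ/10), so a reflectivity value can be inverted to a rain rate. A quick sketch with awk (the function name is just for illustration):

```shell
# Invert the Marshall-Palmer Z-R relation: Z = 200 * R^1.6,
# with Z = 10^(dBZ/10), giving R = (Z/200)^(1/1.6) in mm/hr.
dbz_to_rainrate() {
  awk -v dbz="$1" 'BEGIN { printf "%.1f\n", (10^(dbz/10)/200)^(1/1.6) }'
}

dbz_to_rainrate 40   # roughly 11.5 mm/hr
```

This is why 40 dBZ is usually read as "heavy rain" on the descriptive scale.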
"Regional Radar Unfiltered"
This section uses the Digital Hybrid Reflectivity (DHR) national mosaic. The data are received from a Unidata feed, which unfortunately (in summer 2025) has become increasingly unstable -- hence this is now a legacy source for the page. This mosaic is also unfiltered, so substantial ground clutter is always present. That's good for detecting biological scatterers, however, so even with the frequent outages I'll keep this data on the page for now.
This section preserves the original dBZ units of the data.
Forecast Model Output (Right Column)
The RAP and NAM sections are NCEP numerical weather prediction model output. The data are from NOAA and accessed thanks to their agreement with the AWS (Amazon) Data Exchange.
Current Hour RAP Fields
The 2-hour forecast ("F02") of the Rapid Refresh (RAP) model is used to represent the conditions of the "current hour." Almost all of the images are produced with GrADS, except for the isentropic charts, which are produced in Gempak since it handles the pressure-to-theta coordinate transformation natively.
The most recent 24 hours are available, so users can cycle through the previous day for each field. No "future" data is plotted.
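The "F02 represents the current hour" bookkeeping amounts to stepping back two hours from the valid time to find the initialization cycle. A sketch using GNU date (the function name is just for illustration):

```shell
# Given a valid time (UTC), report the RAP cycle whose 2-hour
# forecast (F02) is valid at that time: just step back two hours.
# Requires GNU date for the "- 2 hours" relative syntax.
rap_cycle_for() {
  date -u -d "$1 UTC - 2 hours" +"%Y%m%d %Hz F02"
}

rap_cycle_for "2025-06-01 00:00"   # cycle from the previous day, 22z
```

Note the date rollover near 00Z: the valid-time date and the cycle date are not always the same.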
NAM Data
The North American Mesoscale (NAM) model output, at 40-km resolution (Grid 212), is plotted here. Maps are available every 6 hours (F00, F06, ...), up to the end of the model run at 84 hours. Only the 00Z and 12Z runs are plotted, and they overwrite the single viewing page.
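Generating the every-6-hours forecast labels is the kind of small loop these scripts are built from (the F00/F06/... label format matches the page; the rest is illustrative):

```shell
# Forecast-hour labels for a NAM run, every 6 hours out to 84.
for fhr in $(seq 0 6 84); do
  printf 'F%02d\n' "$fhr"
done
```

The same loop drives which GRIB records get pulled and which map gets drawn for each step.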
Skew-T diagrams are plotted for selected cities. I also created a GrADS version of the classic "Weather" program written by Peter Neilley in the 1990s to print text output of some model fields for selected cities.
Hallway Display (Right Column)
This is the link to the hallway display we have on the fourth floor in our building on campus. The panels are not publicly changeable. That would be fun to implement in the future, though!
Additional Common Questions
What computing power does this take?
All of the data ingest, processing, and serving is done on a server that I pay for myself -- no resources from my work are used for this project (aside from the occasional login from my desk at work to fix something). That's why you don't see any mention of my employer anywhere on the pages.
The server is hosted by Ionos; I've been with them since 2003 -- the days of "1and1" as a hosting provider. There isn't really that much data here, and no database pulls or anything, so I only need 50-60 gigabytes of space and a few gigabytes of RAM to serve everything efficiently.
What programming skill does something like this take?
The primary programming and scripting skills I use to maintain the pages are:
- Unix command line fluency (a skill just about everyone needs)
- Shell scripts (a skill just about everyone needs)
- Cron jobs (a skill just about everyone needs)
- GrADS (old, but still works great)
- Gempak (old, but still works great)
Some of my colleagues make fun of me for it, but none of the images here are produced with Python.
I learned Gempak as an undergraduate, and so have written scripts in it for two decades now. The user base is dwindling, but those of us who have stuck around love its ease of use and command-line feel. There are also some very passionate, very vocal folks who have continued to maintain the package after Unidata dropped support for it.
GrADS is what my advisor preferred in graduate school and so I worked very deeply with it for several years. For the record, one of the best and most popular weather data websites in the world has all its images developed in GrADS, too. I'll let you try and figure out which one.
When did you start doing this?
Some of my model scripts go back to 2004 and honestly haven't changed much since then. A grad school colleague, Chris H., shared one of his scripts to calculate several of the fields and that was a huge help in getting started. Once you have your workflow figured out, and you know what colors and regions and data you want to use, the scripts don't need much changing.