Dispersion modeling has happily moved forward over the past 40 years, starting with such models as PTMAX and PTDIS, moving to the UNAMAP modeling center and models distributed on tape, and arriving at websites offering tremendous amounts of information on modeling methods, models, and data sets. But has any real progress been made? All modeling has migrated to ever faster computing systems that are small enough to be carried in a backpack rather than the behemoth Cray monsters of the past. What used to take a month to set up can now be rolled up in a few hours by hotshot engineering jockeys with computer skills we never even dreamed of back in the '70s and '80s. But has our science improved? Do more recent modelers understand that we are still building models and methods on flimsy Gaussian theory, a stopgap measure adopted in the early '70s to quench the fires of the Clean Air Act and the compliance mandates it placed on industrial sources? Given the limits of that atmospheric physics, is its continued promotion worthwhile, or should we move on to something new? In a short 20-minute presentation, this paper will overview the historical timeline of the physics, the models, and the NAAQS compliance directives, and try to answer these questions in a light, direct manner.