Abstract: At present, the most reliable information for inferring storm-time ground electric fields along electrical transmission lines comes from coarsely sampled, national-scale magnetotelluric (MT) data sets, such as that provided by the EarthScope USArray program. An underlying assumption in the use of such data is that they adequately sample the spatial heterogeneity of the surface relationship between geomagnetic and geoelectric fields. Here, we assess the degree to which the density of MT data sampling affects geoelectric hazard assessments. For electrical transmission networks in each of four focus regions across the contiguous United States, we perform two parallel band-limited (10¹–10³ s) hazard analyses: one using only USArray-style (∼70-km station spacing) MT data, and one incorporating denser (≪70-km station spacing) MT data. We find that USArray-style MT sampling alone provides a useful first-order estimate of integrated geoelectric fields along electrical transmission lines. However, we also find that higher-density MT data can in some areas lead to order-of-magnitude differences in line-averaged electric field estimates for individual transmission lines and can yield significant differences in subregional hazard patterns. As we demonstrate using variogram plots, these differences reflect short-spatial-scale variability in Earth conductivity, which in turn reflects regional lithotectonic structure and history. We also provide a cautionary example regarding the use of electrical conductivity models to predict dense MT data: although valuable for hazard applications, such models may only be able to reproduce surface geoelectric fields as captured by the MT data from which they were derived.
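The central hazard metric in the abstract is the integrated (equivalently, line-averaged) geoelectric field along a transmission line, i.e., (1/L) ∫ E · dl. The Python sketch below illustrates one way such a quantity can be computed from a piecewise-linear line path and horizontal E-field estimates at its vertices. It is a minimal sketch, not the authors' implementation: the function name, the trapezoidal discretization, and the synthetic inputs are all illustrative assumptions.

```python
# Illustrative sketch (assumed, not the paper's code): line-averaged
# geoelectric field along a transmission line, given the horizontal E-field
# already estimated at points along the line (e.g., from site-wise MT results).
import numpy as np

def line_averaged_e_field(path_xy, e_xy):
    """Average E along a piecewise-linear path: |integral of E . dl| / L.

    path_xy : (n, 2) vertex coordinates in meters (local Cartesian).
    e_xy    : (n, 2) horizontal geoelectric field (V/m) at each vertex.
    Returns the line-averaged field magnitude in V/m.
    """
    seg = np.diff(path_xy, axis=0)                        # segment vectors dl
    seg_len = np.linalg.norm(seg, axis=1)                 # segment lengths
    e_mid = 0.5 * (e_xy[:-1] + e_xy[1:])                  # trapezoidal E per segment
    voltage = np.sum(np.einsum("ij,ij->i", e_mid, seg))   # integral E . dl (volts)
    return abs(voltage) / seg_len.sum()

# Toy check: a 100-km east-west line in a uniform 1 V/km eastward field
path = np.array([[0.0, 0.0], [50e3, 0.0], [100e3, 0.0]])
e = np.tile([1e-3, 0.0], (3, 1))                          # 1 V/km = 1e-3 V/m
print(line_averaged_e_field(path, e))                     # ~1e-3 V/m
```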
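The variogram analysis the abstract mentions quantifies how dissimilarity between station measurements grows with station separation, which is how short-spatial-scale variability in Earth conductivity can be exposed. The following sketch computes a simple empirical semivariogram; the binning scheme and the choice of scalar (e.g., a log-amplitude derived from MT impedances) are illustrative assumptions rather than the paper's exact procedure.

```python
# Hedged sketch: empirical semivariogram of a scalar quantity at MT stations,
# gamma(h) = 0.5 * mean[(z_i - z_j)^2] over station pairs binned by distance h.
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """coords: (n, 2) station positions (m); values: (n,) scalar per station."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # unique station pairs only
    dist, semi = d[iu], sq[iu]
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(bin_edges) - 1):
        m = (dist >= bin_edges[k]) & (dist < bin_edges[k + 1])
        if m.any():
            gamma[k] = semi[m].mean()             # mean semivariance in bin k
    return gamma

# Toy usage: 30 stations in a 200-km box with a smooth trend plus noise
rng = np.random.default_rng(0)
xy = rng.uniform(0, 200e3, size=(30, 2))
z = np.log10(1.0 + xy[:, 0] / 50e3) + 0.1 * rng.standard_normal(30)
print(empirical_variogram(xy, z, np.linspace(0, 150e3, 7)))
```

A semivariogram that keeps rising at short lags, rather than flattening, is one signature of variability at spatial scales finer than the nominal station spacing.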