In printed circuit boards (PCBs) and integrated circuits (ICs), transmission lines route digital or analog signals between driver and receiver components. Any transmission line should be designed to minimize signal distortion and ringing, although satisfying multiple design goals simultaneously is not always feasible. Prior discussions of transmission line optimization have focused on linking experimentally observed signal behavior to empirical circuit models (e.g., the RLGC(f) model) with Lorentzian or wideband Debye dispersion. While these circuit and dispersion models are highly accurate, an integrated framework that links digital signal behavior in a PCB back to interconnect geometry, in the presence of dispersion and copper roughness, for causal interconnects with ~10 GHz bandwidth is lacking. In this paper, the link between transmission line geometry and causal digital signal behavior is presented and discussed, together with a procedure for optimizing transmission line geometry. Geometric optimization is performed using differential evolution while accounting for causal signal behavior and dielectric dispersion throughout the signal bandwidth. Numerical results for microstrips show that differential evolution can optimize interconnect geometry, and the procedure presented here can be readily incorporated into modern CAD applications. The procedure can also be applied to other transmission line geometries and to analog signals with well-defined bandwidth. Signal distortion metrics and S-parameters could be incorporated as additional objective functions.
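To illustrate the kind of optimization described above, the following is a minimal sketch of differential evolution applied to microstrip geometry. The closed-form Hammerstad impedance model, the 50 Ω target, and the substrate parameters are illustrative assumptions only; the full objective in the paper additionally accounts for causal dispersion and copper roughness, which are omitted here.

```python
import math
import random

def microstrip_z0(w, h, er):
    """Approximate characteristic impedance (ohms) of a microstrip of
    width w over a dielectric of height h (same units) with relative
    permittivity er, using the Hammerstad closed-form model."""
    u = w / h
    if u <= 1.0:
        eeff = (er + 1) / 2 + (er - 1) / 2 * ((1 + 12 / u) ** -0.5
                                              + 0.04 * (1 - u) ** 2)
        return 60 / math.sqrt(eeff) * math.log(8 / u + u / 4)
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 / u) ** -0.5
    return 120 * math.pi / (math.sqrt(eeff)
                            * (u + 1.393 + 0.667 * math.log(u + 1.444)))

def differential_evolution(objective, lo, hi, pop_size=20, f=0.8,
                           cr=0.9, generations=200, seed=0):
    """Bare-bones one-dimensional DE/rand/1 minimizer on [lo, hi]."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    cost = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct population members.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = pop[a] + f * (pop[b] - pop[c])
            # Crossover (simplified for 1-D): keep the mutant with prob cr.
            if rng.random() > cr:
                trial = pop[i]
            trial = min(max(trial, lo), hi)  # clamp to the search bounds
            # Greedy selection: replace the target if the trial is better.
            c_trial = objective(trial)
            if c_trial < cost[i]:
                pop[i], cost[i] = trial, c_trial
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Example: find the trace width giving ~50 ohms on a 0.2 mm FR-4
# substrate (er ~ 4.4); both values are assumed for illustration.
h_mm, er = 0.2, 4.4
w_opt, err = differential_evolution(
    lambda w: abs(microstrip_z0(w, h_mm, er) - 50.0), lo=0.05, hi=2.0)
```

In a CAD setting the scalar objective would be replaced by the eye-diagram or distortion metrics evaluated over the full signal bandwidth, with each geometric parameter (width, height, spacing) as an additional search dimension.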