Modern optimization --- both in its driving theory and in its classical and contemporary algorithms --- is illuminated by geometry. I will present two case studies of this approach. The first example --- seeking a common point of two convex sets by alternately projecting onto each --- is classical, intuitive, and widely used (even without convexity) in areas such as image processing and low-order control. I will sketch some nonconvex cases and relate the algorithm's convergence to the intrinsic geometry of the two sets. The second case study revolves around ``partly smooth manifolds'' --- a geometric generalization of the active set notion fundamental to classical nonlinear optimization. I emphasize examples from eigenvalue optimization. Partly smooth geometry opens the door to acceleration strategies for first-order methods and is central to sensitivity analysis. Reassuringly, given its power as a tool, this geometry is present generically in semi-algebraic optimization problems.
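To make the first case study concrete, here is a minimal sketch of the method of alternating projections for two convex sets. The particular sets (a unit ball and a halfspace), the starting point, and the tolerance are illustrative assumptions, not details from the talk; in the convex case the iterates converge to a point of the intersection whenever it is nonempty.

```python
import numpy as np

def project_ball(x, center, radius):
    # Euclidean projection onto the ball {y : ||y - center|| <= radius}
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {y : a . y <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def alternating_projections(x0, proj_A, proj_B, tol=1e-10, max_iter=1000):
    # Alternately project onto A and B until the iterates stall.
    x = x0
    for _ in range(max_iter):
        x_new = proj_B(proj_A(x))
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Illustrative example: unit ball at the origin intersected with {y : y[0] >= 0.5}.
proj_A = lambda x: project_ball(x, np.zeros(2), 1.0)
proj_B = lambda x: project_halfspace(x, np.array([-1.0, 0.0]), -0.5)
x = alternating_projections(np.array([-2.0, 1.5]), proj_A, proj_B)
# x lies (to tolerance) in both sets: ||x|| <= 1 and x[0] >= 0.5
```

The linear convergence rate of this scheme is governed by the angle at which the two sets meet --- precisely the kind of intrinsic geometry the talk relates to the algorithm's behavior.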