The Volterra series expansion has well-documented merits for modeling smooth nonlinear systems. Given that nature itself is parsimonious and models with minimal degrees of freedom are attractive from a system identification viewpoint, estimating sparse Volterra models is of paramount importance. Existing estimators of Volterra kernels based on input-output data are sparsity agnostic because they rely on standard (possibly recursive) least-squares approaches. Instead, the present contribution develops batch and recursive algorithms for estimating sparse Volterra kernels using the least absolute shrinkage and selection operator (Lasso) along with its recent weighted and online variants. Analysis and simulations demonstrate that the weighted (recursive) Lasso has the potential to obviate the "curse of dimensionality," especially in the under-determined case where the input-output samples are fewer than the number of unknowns dictated by the order of the expansion and the memory of the kernels.
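To make the setup concrete, the following is a minimal sketch (not the paper's algorithm) of Lasso-based estimation of a sparse second-order Volterra model. The memory length, sparsity pattern, noise level, and regularization weight are all illustrative assumptions, and the Lasso is solved here with plain iterative soft-thresholding (ISTA) in NumPy rather than the batch/recursive schemes the contribution develops.

```python
import numpy as np

# Illustrative sketch: a second-order Volterra model
#   y[n] = sum_k h1[k] x[n-k] + sum_{k<=l} h2[k,l] x[n-k] x[n-l] + noise
# with a sparse kernel, estimated via the Lasso.

rng = np.random.default_rng(0)
M = 5        # kernel memory (assumed)
N = 200      # number of input-output samples (assumed)

def volterra_regressors(x, M):
    """Regression matrix: M linear taps plus the M(M+1)/2 unique quadratic products."""
    rows = []
    for n in range(M - 1, len(x)):
        lin = x[n - M + 1:n + 1][::-1]  # x[n], x[n-1], ..., x[n-M+1]
        quad = [lin[k] * lin[l] for k in range(M) for l in range(k, M)]
        rows.append(np.concatenate([lin, quad]))
    return np.asarray(rows)

x = rng.standard_normal(N)
Phi = volterra_regressors(x, M)
P = Phi.shape[1]  # total unknowns: M + M(M+1)/2 = 20 here

theta_true = np.zeros(P)                      # sparse ground-truth kernel
theta_true[[0, 3, M + 1]] = [1.0, -0.5, 0.8]  # only 3 active coefficients
y = Phi @ theta_true + 0.01 * rng.standard_normal(Phi.shape[0])

def lasso_ista(Phi, y, lam, iters=5000):
    """Minimize 0.5*||y - Phi t||^2 + lam*||t||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2  # Lipschitz constant of the LS gradient
    t = np.zeros(Phi.shape[1])
    for _ in range(iters):
        g = t - Phi.T @ (Phi @ t - y) / L    # gradient step on the LS term
        t = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return t

theta_hat = lasso_ista(Phi, y, lam=0.5)
support = np.flatnonzero(np.abs(theta_hat) > 1e-2)
print(P, sorted(support))
```

Shrinking N below P (e.g., N = 15 gives 11 equations for 20 unknowns) puts the same code in the under-determined regime the abstract refers to, where sparsity-agnostic least squares has no unique solution but the Lasso can still recover a sparse kernel.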