Existing approaches to the Maximum-Likelihood (ML) detection problem in digital communications either suffer from exponential complexity (e.g., the Sphere Decoder and its variants) or exhibit significant Bit-Error-Rate (BER) degradation (e.g., the LMMSE Detector). In this paper we present an efficient implementation of a semi-definite relaxation-based detector (SDR Detector) which achieves near-optimal BER performance with worst-case polynomial complexity. This implementation (available online) can be 100 times faster than an off-the-shelf SeDuMi-based implementation, outperforms the Sphere Decoder in low Signal-to-Noise Ratio (SNR) or high dimension regimes, and matches the speed of the Sphere Decoder in the high SNR regime. The core of the detector is an optimized dual-scaling interior-point method (implemented in C) for the relaxed semi-definite program. SNR-sensitive improvements are achieved by a dimension reduction strategy and a warm start technique based on a truncated version of the Sphere Decoding algorithm. Extensive numerical simulations show that the BER performance and the running time of the SDR Detector compare favorably with those of other near-optimal detection strategies.
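For concreteness, the following is a minimal sketch of the standard semi-definite relaxation that underlies detectors of this type, assuming real-valued BPSK transmission over a linear channel, y = Hx + v with x in {-1,+1}^n; the symbols H, y, x, Q here are illustrative and are not taken from the paper itself. The ML problem is

\[
\hat{x}_{\mathrm{ML}} \;=\; \arg\min_{x \in \{-1,+1\}^{n}} \; \lVert y - H x \rVert_{2}^{2}.
\]

Lifting z = [x^{\mathsf T},\,1]^{\mathsf T} and X = z z^{\mathsf T}, and collecting the quadratic form into

\[
Q \;=\;
\begin{bmatrix}
H^{\mathsf T} H & -H^{\mathsf T} y \\
-y^{\mathsf T} H & y^{\mathsf T} y
\end{bmatrix},
\]

dropping the rank-one constraint on X gives the relaxed semi-definite program

\[
\min_{X \succeq 0} \; \operatorname{Tr}(Q X)
\quad \text{s.t.} \quad X_{ii} = 1, \; i = 1, \dots, n+1,
\]

which is solvable in polynomial time; a binary symbol estimate is then recovered from the solution X, e.g., by taking signs of its last column or by a randomized rounding step.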