In the standard scenario of multiuser detection, the maximum-likelihood (ML) detector is optimal in the sense of minimum error probability. Unfortunately, ML detection requires solving a combinatorial optimization problem that is NP-hard in general, so the optimal solution is unlikely to be found efficiently. We consider an accurate and efficient quasi-ML detector that uses semidefinite relaxation (SDR) to approximate the ML detector. This SDR-ML detector was recently shown to achieve a bit error rate (BER) close to that of the true ML detector. Here, we show that several existing suboptimal detectors, such as the decorrelator, can be viewed as degenerate versions of the SDR-ML detector. Hence, the SDR-ML detector is expected to outperform those detectors. This expectation is confirmed by simulations, in which the BER of the SDR-ML detector is significantly lower than that of other suboptimal detectors, including the decorrelator and the linear minimum-mean-square-error (LMMSE) detector.
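To make the combinatorial problem concrete, the following sketch sets up a toy synchronous CDMA instance and solves the ML detection problem by exhaustive search, which is feasible only for a very small number of users; the SDR-ML detector in the abstract replaces this search with a polynomial-time semidefinite program, which would require an SDP solver and is not shown here. All concrete values (the cross-correlation matrix `R`, the bit vector `b_true`, the noise sample) are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Hypothetical toy instance: K = 3 users with unit amplitudes.
# R is an assumed signature cross-correlation matrix;
# y = R b + n models the matched-filter outputs.
R = [[1.0, 0.3, 0.2],
     [0.3, 1.0, 0.4],
     [0.2, 0.4, 1.0]]
b_true = [1, -1, 1]

def matvec(M, x):
    """Plain-Python matrix-vector product."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

noise = [0.05, -0.02, 0.03]  # fixed small noise sample, for reproducibility
y = [yi + ni for yi, ni in zip(matvec(R, b_true), noise)]

def ml_detect(y, R):
    """Exhaustive ML detection for BPSK multiuser detection:
    maximize 2 y^T b - b^T R b over all b in {-1, +1}^K.
    Cost is 2^K metric evaluations, hence only viable for tiny K."""
    K = len(y)
    best_b, best_metric = None, float("-inf")
    for b in product((-1, 1), repeat=K):
        Rb = matvec(R, b)
        metric = (2 * sum(yi * bi for yi, bi in zip(y, b))
                  - sum(bi * rbi for bi, rbi in zip(b, Rb)))
        if metric > best_metric:
            best_metric, best_b = metric, list(b)
    return best_b

print(ml_detect(y, R))  # → [1, -1, 1]: the true bits are recovered
```

The metric enumerated here is exactly the objective that SDR relaxes: lifting `b` to the rank-one matrix `B = b b^T` turns the problem into a linear objective over positive semidefinite matrices with unit diagonal, and dropping the rank constraint yields the tractable SDP.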