Clustering is a fundamental task in data analysis and machine learning, and clustering objectives give rise to well-studied optimization problems; among these, the \(k\)-Means objective is arguably the best known. Given a collection of points in a metric space, the goal is to partition them into \(k\) clusters, each with an associated center, so as to minimize the sum of squared distances of the points to their cluster centers. In this paper, we present a polynomial-time \((3+2\sqrt{2}+ε)\)-approximation algorithm for \(k\)-Means in general metrics, for any constant \(ε>0\); note that \(3+2\sqrt{2}<5.83\). This substantially improves on the current-best \((9+ε)\)-approximation in [Ahmadian, Norouzi-Fard, Svensson, Ward - FOCS’17, SICOMP’20], and even slightly improves on the \(5.92\)-approximation in [Cohen-Addad, Esfandiari, Mirrokni, Narayanan - STOC’22] for the Euclidean special case. A natural approach to \(k\)-Means is to leverage Lagrangian Multiplier Preserving (LMP) approximations for the facility location problem. The previous best results for \(k\)-Means build upon an adaptation of the primal-dual LMP \(3\)-approximation for facility location with metric connection costs in [Jain, Vazirani - J.ACM’01], rather than on the improved greedy LMP \(2\)-approximation for the same problem in [Jain, Mahdian, Markakis, Saberi, Vazirani - J.ACM’03]. The barrier to using the improved LMP algorithm was that no adaptation of it and its analysis to the case of squared metric connection costs was known (squared distances violate the triangle inequality). Our main contribution is overcoming this barrier by providing such an adaptation. Our new LMP approximation algorithm is then combined with the framework recently introduced in [Cohen-Addad, Grandoni, Lee, Schwiegelshohn, Svensson - STOC’25] for the related (metric) \(k\)-Median problem.
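For concreteness, the objective described above can be stated as follows (the symbols \(P\), \(d\), and \(C\) are notation introduced here for illustration): given a finite point set \(P\) in a metric space with distance function \(d\), find a set \(C\) of \(k\) centers minimizing
\[
  \sum_{p \in P} \min_{c \in C} d(p,c)^2 .
\]
Replacing the squared distance \(d(p,c)^2\) with \(d(p,c)\) yields the related \(k\)-Median objective; the squaring is precisely what breaks the triangle inequality on connection costs and necessitates the adaptation developed in this paper.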