The Kurdyka-Łojasiewicz (KL) inequality is a fundamental tool for analyzing the convergence of numerical methods for nonsmooth and nonconvex optimization problems. In this talk, we discuss recent developments on two aspects of the KL inequality. In the first part, we establish an abstract extended convergence framework that enables one to derive superlinear convergence towards a specific target set (such as the set of second-order stationary points) under a generalized metric subregularity condition, extending the widely used convergence-analysis framework based on the KL inequality. We then show that this generalized metric subregularity for second-order stationary points is guaranteed by the KL inequality together with a strict saddle point condition, which in turn is readily satisfied in several important applications. In the second part, we explain an approach for estimating the exponent (when it exists) in the KL inequality via a lift-and-project approach. This enables us to estimate KL exponents for functions with semidefinite-programming-representable and C^2-cone-reducible structures. As an application, we establish a convergence analysis for the cubic regularized Newton’s method with momentum steps. Specifically, when this method is applied to the (nonconvex) over-parameterized compressed sensing model, we obtain a (local) quadratic convergence rate to a global minimizer under the strict complementarity condition. In the absence of the strict complementarity condition, we obtain a sublinear convergence rate of O(1/k^2) to a global minimizer. This is based on joint work with Boris Mordukhovich, Ting Kei Pong, Peiran Yu, and Jiangxing Zhu.
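
For orientation, one standard form of the KL inequality with exponent is recalled here (this is the textbook definition, stated for context; the precise variant used in the talk may differ): a proper lower semicontinuous function f satisfies the KL inequality at a stationary point x̄ with exponent θ ∈ [0, 1) if there exist c > 0, η > 0, and a neighborhood U of x̄ such that

    dist(0, ∂f(x)) ≥ c (f(x) − f(x̄))^θ   for all x ∈ U with f(x̄) < f(x) < f(x̄) + η.

The exponent θ governs the convergence rate one can derive; for instance, for first-order descent methods, θ = 1/2 typically yields local linear convergence.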