Finite mixture regression (FMR) models are powerful tools for analyzing data of various types, owing to their flexible model structure and appealing interpretability. Applications can be found in a variety of areas, such as economics, finance, and clinical trials. In this report, we focus on logistic-normal mixture models, which allow both the mixing proportions and the component means to vary with covariates. We establish consistency of the parameter estimates obtained by a penalized maximum likelihood method when the sample size increases but the number of covariates is fixed. We then consider the case where the number of potential covariates grows with the sample size. In this setting, the likelihood function is non-convex, and standard techniques for analyzing model selection with high-dimensional data do not apply. We show, under appropriate conditions, that the LASSO-type estimators of the FMR models remain consistent in terms of Kullback-Leibler divergence.
Keywords: logistic-normal, penalized maximum likelihood, Kullback-Leibler divergence.
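To make the model concrete, the sketch below writes out the penalized negative log-likelihood of a two-component logistic-normal mixture regression, where the mixing proportion follows a logistic model in the covariates and each component mean is linear in the covariates, with an l1 (LASSO-type) penalty on the coefficients. All function and variable names are illustrative assumptions, not taken from the report.

```python
import numpy as np

def penalized_neg_loglik(params, X, y, lam):
    """Penalized negative log-likelihood of a two-component
    logistic-normal mixture regression (illustrative sketch).

    Mixing weight:  pi(x) = 1 / (1 + exp(-x @ gamma))
    Components:     y | x ~ N(x @ beta1, s1^2) with prob. pi(x),
                    y | x ~ N(x @ beta2, s2^2) otherwise.
    An l1 penalty with tuning parameter `lam` is applied to the
    regression coefficients, as in LASSO-type penalized MLE.
    """
    p = X.shape[1]
    gamma = params[:p]            # coefficients of the mixing proportion
    beta1 = params[p:2 * p]       # mean coefficients, component 1
    beta2 = params[2 * p:3 * p]   # mean coefficients, component 2
    # Standard deviations are parameterized on the log scale for positivity.
    s1, s2 = np.exp(params[3 * p]), np.exp(params[3 * p + 1])

    pi = 1.0 / (1.0 + np.exp(-X @ gamma))  # covariate-dependent mixing proportions
    dens1 = np.exp(-0.5 * ((y - X @ beta1) / s1) ** 2) / (np.sqrt(2 * np.pi) * s1)
    dens2 = np.exp(-0.5 * ((y - X @ beta2) / s2) ** 2) / (np.sqrt(2 * np.pi) * s2)
    loglik = np.sum(np.log(pi * dens1 + (1.0 - pi) * dens2))

    penalty = lam * (np.abs(gamma).sum() + np.abs(beta1).sum() + np.abs(beta2).sum())
    return -loglik + penalty
```

In practice this objective is non-convex in the parameters, so it would typically be minimized by an EM-type or generic numerical optimizer (e.g. `scipy.optimize.minimize`) from several starting points.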