Sufficient dimension folding is a technique for reducing the dimensions of matrix- or array-valued objects while preserving their data structure. In this talk, I consider sufficient dimension folding for the regression mean function when the predictors are matrix- or array-valued. I propose the concept of the central mean folding subspace and two local estimation methods for it: folded outer product of gradients estimation (folded-OPG) and folded minimum average variance estimation (folded-MAVE). The asymptotic properties of folded-MAVE are established. I also address sufficient dimension folding for regression from a robustness perspective. I introduce the central functional dimension folding subspace and a class of estimation methods based on robust estimators. Special attention is paid to the central quantile dimension folding subspace, a case of the central functional dimension folding subspace that is of wide interest.