Dear all, with apologies to those who have already seen it, I'd like to point you to this issue of Statistical Science, devoted almost entirely to sparse regression (l1 and l0 regularization). Best regards, Gianluca. From: Project Euclid <
> Contents:
>
> Dimitris Bertsimas, Jean Pauphilet, Bart Van Parys. Sparse Regression: Scalable Algorithms and Empirical Performance. 555--578.
> Trevor Hastie, Robert Tibshirani, Ryan Tibshirani. Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons. 579--592.
> Owais Sarwar, Benjamin Sauk, Nikolaos V. Sahinidis. A Discussion on Practical Considerations with Sparse Regression Methodologies. 593--601.
> Rahul Mazumder. Discussion of "Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons". 602--608.
> Edward I. George. Modern Variable Selection in Action: Comment on the Papers by HTT and BPV. 609--613.
> Yuansi Chen, Armeen Taeb, Peter Bühlmann. A Look at Robustness and Stability of $\ell_{1}$- versus $\ell_{0}$-Regularization: Discussion of Papers by Bertsimas et al. and Hastie et al. 614--622.
> Dimitris Bertsimas, Jean Pauphilet, Bart Van Parys. Rejoinder: Sparse Regression: Scalable Algorithms and Empirical Performance. 623--624.
> Trevor Hastie, Robert Tibshirani, Ryan J. Tibshirani. Rejoinder: Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons. 625--626.
> Michael Schweinberger, Pavel N. Krivitsky, Carter T. Butts, Jonathan R. Stewart. Exponential-Family Models of Random Graphs: Inference in Finite, Super and Infinite Population Scenarios. 627--662.
> Richard D. De Veaux. A Conversation with J. Stuart (Stu) Hunter. 663--671.
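For readers who want a concrete feel for the l1 regularization debated throughout this issue, here is a minimal, self-contained sketch of the lasso solved by coordinate descent on synthetic data. This is an illustrative toy implementation, not the code of any of the papers above; the solver, data, and parameter choices (the penalty `lam`, the iteration count) are my own assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent solver for the lasso:
    minimize (1/(2n)) ||y - X b||^2 + lam ||b||_1.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

# Toy data: only the first 3 of 20 coefficients are truly nonzero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.1 * rng.normal(size=100)

b = lasso_cd(X, y, lam=0.2)
print("selected coefficients:", np.flatnonzero(np.abs(b) > 1e-8))
```

The l1 penalty zeroes out most of the irrelevant coordinates while shrinking the retained ones toward zero; an l0 penalty, by contrast, counts nonzeros directly and selects without shrinkage, which is precisely the trade-off the Bertsimas et al. and Hastie et al. papers examine.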