Methods Inf Med 2001; 40(01): 12-17
DOI: 10.1055/s-0038-1634458
Original Article
Schattauer GmbH

Nonparametric Selection Method of Survival Predictors with an Application to Breast Cancer

J. Vaillant 1, M. Troupé 1, J. Manuceau 1, V. Lánska 2

1   UFR Sciences, Université des Antilles-Guyane, Pointe-à-Pitre, Guadeloupe
2   Institute for Clinical and Experimental Medicine, Prague, Czech Republic

Publication History

Publication Date:
08 February 2018 (online)

Abstract

Mutual information between a survival variable and covariables provides a new tool for selecting covariables with high predictive value whenever there is no reasonable parametric model for the observed phenomena. The information rate carried by the covariables can be tested by means of a decomposition similar to the analysis of variance. Moreover, a method based on information conservation can be used to aggregate survival curves corresponding to different modalities of the same selected predictor, which increases prediction efficiency. These results are applied to survival data from 1304 patients with breast cancer followed over a period of ten years.
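As a rough illustration of the selection criterion described above (not the authors' exact estimator, which handles censoring), the empirical mutual information between a discrete survival outcome and a categorical covariable can be computed directly from joint and marginal frequencies; the `status` and `grade` data below are invented for the example.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete sequences.

    I(X; Y) = sum over (x, y) of p(x, y) * log( p(x, y) / (p(x) * p(y)) ),
    estimated by plug-in relative frequencies.
    """
    n = len(xs)
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    pxy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        # (c/n) / ((px[x]/n) * (py[y]/n)) simplifies to c*n / (px[x]*py[y])
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

# Toy data: survival status at end of follow-up vs. a categorical covariable
status = ['alive', 'dead', 'alive', 'alive', 'dead', 'dead', 'alive', 'dead']
grade  = ['I',     'III',  'I',     'II',    'III',  'III',  'I',     'II']
print(round(mutual_information(status, grade), 4))
```

A covariable that carries no information about survival yields a value near zero, while a strongly predictive one approaches the entropy of the survival variable; comparing these rates across candidate covariables gives the ranking the abstract alludes to.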
