In computer science, a logistic model tree (LMT) is a classification model with an associated supervised training algorithm that combines logistic regression (LR) and decision tree learning.
Logistic model trees are based on the earlier idea of a model tree: a decision tree that has linear regression models at its leaves to provide a piecewise linear regression model (where ordinary decision trees with constants at their leaves would produce a piecewise constant model). In the logistic variant, the LogitBoost algorithm is used to produce an LR model at every node in the tree; the node is then split using the C4.5 criterion. Each LogitBoost invocation is warm-started from its results in the parent node. Finally, the tree is pruned.
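The following is a minimal two-class sketch, in Python/NumPy, of how such a tree can be grown. It is not the exact published procedure: it omits pruning and multi-class handling, uses a fixed number of LogitBoost iterations per node, and the helper names (`grow_lmt`, `logit_boost`, `gain_ratio_split`) and the `min_node` stopping rule are illustrative assumptions.

```python
import numpy as np

def weighted_simple_regression(X, z, w):
    """Weighted simple linear regression z ~ a + b * x_j for each attribute j;
    return the (j, a, b) with the lowest weighted squared error.  LogitBoost
    inside LMT uses such one-attribute base learners."""
    best = None
    for j in range(X.shape[1]):
        x = X[:, j]
        xm, zm = np.average(x, weights=w), np.average(z, weights=w)
        var = np.sum(w * (x - xm) ** 2)
        b = np.sum(w * (x - xm) * (z - zm)) / var if var > 1e-12 else 0.0
        a = zm - b * xm
        err = np.sum(w * (z - (a + b * x)) ** 2)
        if best is None or err < best[0]:
            best = (err, j, a, b)
    return best[1], best[2], best[3]

def logit_boost(X, y, n_iter, F=None):
    """Two-class LogitBoost.  y must be an integer 0/1 array.  F is the additive
    committee of (attribute, intercept, slope) triples inherited from the parent
    node, which is how each node's model is warm-started."""
    F = list(F) if F is not None else []
    f = sum(0.5 * (a + b * X[:, j]) for (j, a, b) in F) if F else np.zeros(len(y))
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-2.0 * f))      # current class-1 probability
        w = np.clip(p * (1.0 - p), 1e-6, None)  # working weights
        z = np.clip((y - p) / w, -3.0, 3.0)     # clipped working response
        j, a, b = weighted_simple_regression(X, z, w)
        F.append((j, a, b))
        f = f + 0.5 * (a + b * X[:, j])
    return F

def predict_proba(F, X):
    """Class-1 probability under the additive logistic model F."""
    f = sum(0.5 * (a + b * X[:, j]) for (j, a, b) in F) if F else np.zeros(len(X))
    return 1.0 / (1.0 + np.exp(-2.0 * f))

def gain_ratio_split(X, y):
    """Best binary split (attribute, threshold) by a C4.5-style gain ratio."""
    def entropy(labels):
        p = np.bincount(labels) / len(labels)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    base, best = entropy(y), None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= t
            gain = base - (left.sum() * entropy(y[left])
                           + (~left).sum() * entropy(y[~left])) / len(y)
            split_info = entropy(left.astype(int))
            if split_info > 0 and (best is None or gain / split_info > best[0]):
                best = (gain / split_info, j, t)
    return best  # None when no split is possible

def grow_lmt(X, y, n_iter=10, min_node=15, parent_model=None):
    """Recursively grow an (unpruned) logistic model tree."""
    model = logit_boost(X, y, n_iter, F=parent_model)   # logistic model at this node
    split = gain_ratio_split(X, y) if (len(y) >= min_node
                                       and len(np.unique(y)) > 1) else None
    if split is None:
        return {"model": model}                          # leaf node
    _, j, t = split
    left = X[:, j] <= t
    return {"model": model, "attr": j, "thresh": t,
            "left": grow_lmt(X[left], y[left], n_iter, min_node, model),
            "right": grow_lmt(X[~left], y[~left], n_iter, min_node, model)}
```

Prediction at a new point follows the splits down to a leaf and applies that leaf's logistic model via `predict_proba`.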
The basic LMT induction algorithm uses cross-validation to find a number of LogitBoost iterations that does not overfit the training data. A faster version has been proposed that uses the Akaike information criterion to control LogitBoost stopping.
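As a sketch of this iteration-selection step, the fragment below builds on the `logit_boost` and `predict_proba` helpers from the previous example; `choose_iterations_cv` and its parameters are illustrative names rather than the published interface. It grows one committee per fold, records the validation error after every iteration, and returns the iteration count with the lowest mean error; in the published algorithm this count is determined once and then reused when building the whole tree, and the faster variant replaces the cross-validation search with AIC-based stopping.

```python
import numpy as np

def choose_iterations_cv(X, y, max_iter=200, n_folds=5, seed=0):
    """Pick the LogitBoost iteration count by cross-validation (uses the
    logit_boost / predict_proba helpers from the sketch above)."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_folds)
    errors = np.zeros(max_iter)
    for k in range(n_folds):
        val = folds[k]
        train = np.concatenate([folds[i] for i in range(n_folds) if i != k])
        F = []                                  # grow one committee per fold
        for m in range(max_iter):
            F = logit_boost(X[train], y[train], n_iter=1, F=F)  # one more round
            pred = (predict_proba(F, X[val]) >= 0.5).astype(int)
            errors[m] += np.mean(pred != y[val])
    return int(np.argmin(errors)) + 1           # 1-based iteration count
```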
References
- Landwehr, N.; Hall, M.; Frank, E. (2003). "Logistic Model Trees" (PDF). ECML PKDD.
- Landwehr, N.; Hall, M.; Frank, E. (2005). "Logistic Model Trees" (PDF). Machine Learning. 59 (1–2): 161–205. doi:10.1007/s10994-005-0466-3.
- Sumner, M.; Frank, E.; Hall, M. (2005). "Speeding up Logistic Model Tree Induction" (PDF). PKDD. Springer. pp. 675–683.