This article is rated Start-class on Wikipedia's content assessment scale.
Yet the article does not really discuss regression trees (an analytical technique, not a procedural one) at all. If I wanted to find out what a regression tree or a classification tree was, I would not find this article particularly helpful. — Preceding unsigned comment added by 75.146.224.18 (talk) 00:21, 21 December 2011 (UTC)
The following passage had no connection at all with the article: it is totally out of context. What does it have to do with the subject? If it is a derivative of the subject, it should have its own encyclopedic entry displaying the full path of its development and explanation. Without an introduction giving it meaning and context, it cannot be accepted:
"Creation of decision nodes
Three popular rules are applied in the automatic creation of classification trees. The Gini rule splits off a single group of as large a size as possible, whereas the entropy and twoing rules find multiple groups comprising as close to half the samples as possible. Both algorithms proceed recursively down the tree until stopping criteria are met.
The Gini rule is typically used by programs that build ('induce') decision trees using the CART algorithm. Entropy (or information gain) is used by programs that are based on the C4.5 algorithm. A brief comparison of these two criteria can be seen under Decision tree formulae.
More information on automatically building ('inducing') decision trees can be found under Decision tree learning."
User:misteror Nov 17, 2008 —Preceding unsigned comment added by 217.23.162.212 (talk) 15:17, 17 November 2008 (UTC)
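For readers comparing the rules described in the quoted passage, here is a minimal sketch (not from the article) of the two impurity measures behind the Gini and entropy splitting rules, applied to class counts at a node:

```python
from math import log2

def gini(counts):
    """Gini impurity: 1 - sum of squared class proportions."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def entropy(counts):
    """Shannon entropy: -sum of p * log2(p) over non-empty classes."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c)

# A pure node scores 0 under both criteria; an even 50/50 node scores worst.
print(gini([5, 5]))     # 0.5
print(entropy([5, 5]))  # 1.0
print(gini([10, 0]))    # 0.0
```

Both measures are minimized by pure nodes, which is why either can drive the recursive splitting described above; they differ only in how strongly they reward near-pure splits.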
Would like to learn more about the software, especially whether open-source, web-based tools or client binaries exist. —Preceding unsigned comment added by 130.207.180.93 (talk) 21:00, 12 December 2007 (UTC)
This article talks about the advantages of using decision trees, but shouldn't it also include the disadvantages? User:noneforall October 14, 2007
I think the note below is right on -- I'm amazed this entry hasn't been fixed.
My thought on a fix is that most of the decision tree entry needs to be moved to a more specific category, maybe decision tree learning. The data mining form of decision tree learning could be linked from a corrected decision tree page, with a parenthetical note about the confusion over terminology. Influence diagrams and decision analysis need to be referenced. Etc. I agree... there's no definition of what the diagrams mean -- their spacing, colors, numbers, etc. I'm going to add a template and flag this article. Maybe it will get fixed then. RCanine 14:21, 6 April 2007 (UTC)
Someone should probably point out the Z criterion (sqrt(positive weight * negative weight)), which is used by AdaBoost (Schapire and Singer). Earlier, it was analyzed by Kearns + Mansour (IIRC) in the case where example weights are uniform, and they cited Quinlan as first proposing it.
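As a hedged sketch of that criterion (the factor of 2 and the per-branch sum follow Schapire and Singer's formulation; the example weights are assumed normalized to sum to 1):

```python
from math import sqrt

def z_criterion(branches):
    """Schapire-Singer Z for a candidate split: 2 * sum over branches
    of sqrt(positive_weight * negative_weight). Lower is better; a
    pure branch (one class weight zero) contributes nothing."""
    return 2.0 * sum(sqrt(wp * wn) for wp, wn in branches)

# An unsplit node holding half the weight in each class scores Z = 1.0;
# peeling off a pure branch lowers Z, so the split is preferred.
print(z_criterion([(0.5, 0.5)]))
print(z_criterion([(0.4, 0.0), (0.1, 0.5)]))
```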
"...is a white box model" - Ahahahaha! The hilarity of the mental processes which lead anyone to think up the concept of a "white box" has brightened my day.
I'm shocked that there is no mention of decision trees as a decision aid - where the expected values of various choices are calculated. This is what I understand as a Decision Tree - the stuff about their use in data mining is only of secondary importance to my mind.
For example, a factory manager has to decide whether to invest in product A or product B (she cannot do both due to budget constraints). Product A is estimated to require two million pounds (or dollars, if you like) of R&D investment, but has only a 50% chance of the research being successful and a product being obtained. It will then have a 30% chance of making a $5M profit, a 40% chance of making a $10M profit, and a 30% chance of not selling at all and making a loss of $1M for the manufacturing costs. Product B, on the other hand, will cost $3M in R&D but has an 80% chance of making a $4M profit and a 20% chance of a $2M loss. If the company has a policy of maximising expected values, which should she go for?
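Working that example through (a sketch, assuming the stated profits and losses come on top of the R&D cost, which is paid regardless of outcome):

```python
def expected_value(rd_cost, outcomes):
    """Expected monetary value of a venture, in $M. outcomes is a list
    of (probability, payoff) pairs; the R&D cost is always incurred."""
    return sum(p * v for p, v in outcomes) - rd_cost

# Product A: $2M R&D; 50% chance research fails (payoff 0); if it
# succeeds, 30% -> $5M, 40% -> $10M, 30% -> -$1M.
ev_a = expected_value(2.0, [(0.5, 0.0),
                            (0.5 * 0.3, 5.0),
                            (0.5 * 0.4, 10.0),
                            (0.5 * 0.3, -1.0)])

# Product B: $3M R&D; 80% -> $4M, 20% -> -$2M.
ev_b = expected_value(3.0, [(0.8, 4.0), (0.2, -2.0)])

print(ev_a, ev_b)  # EV(A) is about $0.6M, EV(B) about -$0.2M: choose A
```

Rolling back the tree this way (expected value at each chance node, best branch at each decision node) is exactly the decision-aid usage the comment describes.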
This is just an example off the top of my head, but a more domestic example is someone deciding whether to rent or buy a house, along with a capital gain or loss depending on where house prices go and what the cost of renovation (or "fixing up", I think, in AmEng) will be.
Decision trees are taught to teenage business students in the UK, but none of them would recognise this article. Decision trees are an example of an operations research or management science method.
The most important part of the article has been left out!
I'd also like to add that the highly mathematical formal description of decision trees is not going to be understood by most readers. Articles like this need to start with a very simple example that everyone can understand. --62.253.44.188 15:08, 6 August 2006 (UTC)
Decision trees are also important in machine learning, not just management science. It would be good to see this distinction elaborated on in the article. There also needs to be more elaboration (or links to other articles) on constructing decision trees - mentioning ID3 and C4.5 is a start. Also, what about the example provided? How is the threshold value of 70 chosen for humidity? This seems wrong.
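On the threshold question: C4.5-style learners choose a numeric cut point by scoring candidate midpoints by information gain, rather than fixing a value in advance. A minimal sketch with hypothetical humidity data (the readings and labels here are made up for illustration, not taken from the article's example):

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def best_threshold(values, labels):
    """C4.5-style numeric split: score midpoints between consecutive
    distinct sorted values, keep the one with highest information gain."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best = (None, -1.0)
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for v, l in pairs if v <= t]
        right = [l for v, l in pairs if v > t]
        gain = (base
                - (len(left) / len(pairs)) * entropy(left)
                - (len(right) / len(pairs)) * entropy(right))
        if gain > best[1]:
            best = (t, gain)
    return best

humidity = [65, 70, 75, 80, 85, 90, 95]
play = ['y', 'y', 'y', 'n', 'n', 'n', 'n']
print(best_threshold(humidity, play))  # picks the midpoint 77.5
```

So a threshold like 70 would only be "right" if it maximized the gain on the training data used; the comment is correct that an unexplained value looks arbitrary.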
I tried to add decision tree software to the list, as it was in keeping with the other links; why would Informavores not qualify for entry on this page? —The preceding unsigned comment was added by Louharris (talk • contribs) 09:29, 3 April 2007 (UTC).
I've done a few decision trees and I think the example given is confusing, especially without clarification of which colors mean what. A simpler starting example would be nice, to illustrate the principle. fsiler 19:36, 30 July 2007 (UTC)
Probability tree redirects here but needs its own entry, as in maths it's something different: a diagram illustrating possible outcomes from a series of events. It isn't a decision tree, as you can't decide the steps; they occur as the result of chance, e.g. a coin toss. The secondary-school maths curriculum of numerous English-speaking countries such as New Zealand specifies this usage, and students will come to Wikipedia looking for information on them. Strayan (talk) 06:30, 30 June 2008 (UTC)
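To illustrate the distinction, a probability tree for pure chance events can be enumerated directly; a minimal sketch for two tosses of a fair coin (there are no decision nodes anywhere in this tree):

```python
from itertools import product
from fractions import Fraction

# Each branch of the tree carries probability 1/2, and a leaf's
# probability is the product of the probabilities along its path.
half = Fraction(1, 2)
leaves = {''.join(path): half * half for path in product('HT', repeat=2)}

print(leaves)                # HH, HT, TH, TT each with probability 1/4
print(sum(leaves.values()))  # leaf probabilities sum to 1
```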
A section should be added on decision tree software like that available in SAS Enterprise Miner. A good resource is Barry de Ville's Decision Trees for Business Intelligence and Data Mining: Using SAS Enterprise Miner. SAS Institute, Inc., Cary, NC, 2006. La9rsemar (talk) 18:30, 2 September 2009 (UTC)
Perhaps it's worth noting that decision trees can be regarded as a limited form of more expressive techniques, e.g. algorithms or Markov chains. Rp (talk) 14:36, 9 December 2009 (UTC)
Surely the online examples given aren't really the same as what is described in the article. Decision trees are used to help with a decision, not to help with navigation. —Preceding unsigned comment added by 87.194.30.19 (talk) 13:31, 30 January 2010 (UTC)
The text starts with 10,000,000 when the tree shows 33,333, and it describes a bold line going from nodes 1, 3, 5 and so on, when no numbers of that kind appear anywhere. The text should be removed and a new explanation written. —Preceding unsigned comment added by 84.248.180.73 (talk) 13:34, 21 April 2011 (UTC)
There was some great shareware decision tree software back in the days of DOS. I've forgotten its name and cannot find it -- does anyone know any more? 92.24.186.101 (talk) 12:46, 23 December 2010 (UTC)
Unless anyone has good reasons for objecting, I intend to delete the non-decision-tree material and other material which appears to be personal research. I think the following should be deleted: the flow diagram (appears to be personal research), influence diagrams, utility preferences (the text does not connect them with decision trees), and references to AI and genetic algorithms (which are not referenced in the text as far as I can see), plus minor cleaning up. 92.23.38.246 (talk) 22:00, 20 April 2013 (UTC)
The file File:RiskPrefSensitivity2Threshold.png (at right) currently in this article is intended to demonstrate that for certain values Product A is superior, whereas for others Product B is superior. The article notes that they are even at $400K. However, the graph is zoomed out in such a way that the reader has to look very closely to see this. Also, to maximize accessibility for Wikipedia users with lower reading levels, it would be great if the axis were labeled at each $100K increment, including $400K, rather than just each $1M. This graph should be replaced by a clearer one. Unfortunately, the creator of this graph, A m sheldon, is no longer active on Wikipedia. Sondra.kinsey (talk) 14:02, 11 November 2017 (UTC)
This article has been targeted by an (apparent) campaign to insert "Decision Stream" into various Wikipedia pages about machine learning. "Decision Stream" refers to a recently published paper that currently has zero academic citations. [1] The number of articles that have been specifically edited to include "Decision Stream" within the last couple of months suggests conflict-of-interest editing by someone who wants to advertise this paper. They are monitoring these pages and quickly reverting any edits to remove this content.
Known articles targeted:
BustYourMyth (talk) 19:18, 26 July 2018 (UTC)
References
Dear BustYourMyth,
Your activity is quite suspicious: the account was registered just to delete the mention of one popular article. People from different countries with a positive history of Wikipedia contributions are taking part in reverting your edits, as well as in providing information about "Decision Stream".
Kind regards, Dave — Preceding unsigned comment added by 62.119.167.36 (talk) 13:33, 27 July 2018 (UTC)
I asked for partial protection at WP:ANI North8000 (talk) 17:08, 27 July 2018 (UTC)
This edit request has been answered. Set the |answered= or |ans= parameter to no to reactivate your request.
https://en.wikipedia.org/wiki/Decision_tree_learning should be added to the "See also" section. Painted desert (talk) 06:04, 16 August 2018 (UTC)
This edit request has been answered. Set the |answered= or |ans= parameter to no to reactivate your request.
Link "tree-like" in the lead section to Tree (graph theory). 76.183.236.210 (talk) 19:31, 18 September 2018 (UTC)
Decision trees are used in field guides to determine tree species. For examples of DTs in taxonomy determination, see https://www.researchgate.net/publication/309126688_Supervised_Machine_Learning_for_Plants_Identification_Based_on_Images_of_Their_Leaves, http://ceur-ws.org/Vol-1178/CLEF2012wn-ImageCLEF-CeruttiEt2012.pdf , https://books.google.kz/books?id=X3bTKkpZ58wC&pg=PA82&lpg=PA82&dq=decision+tree+botanical+determination&source=bl&ots=eBu6beh5k2&sig=ACfU3U3FQs7c1rhcGh_Cfe9r2oY_nvQO9A&hl=ru&sa=X&ved=2ahUKEwicmtKB9bTpAhXss4sKHenqB3kQ6AEwD3oECAYQAQ#v=onepage&q=decision%20tree%20botanical%20determination&f=false , https://www.ijirae.com/volumes/Vol2/iss6/16.JNAE10093.pdf (p. 114), and https://www.jove.com/science-education/10070/tree-identification-how-to-use-a-dichotomous-key (here called a dichotomous key and shown as a table, but basically the same as a two-choice DT). 37.99.32.95 (talk) 03:38, 15 May 2020 (UTC)
… Decision trees trace their origins to the era of the early development of written records. Kdammers (talk) 01:46, 19 July 2022 (UTC)