
Logistic Regression vs. Decision Tree

Find the best method for classification based on your data.

By Shital Kat · Aug. 08, 19 · Big Data Zone · Tutorial

When to Use Each Algorithm

Logistic Regression (LR) and Decision Trees (DT) both solve classification problems, and both are easy to interpret; however, each has pros and cons. Choose the appropriate algorithm based on the nature of your data.

Of course, at the initial stage, we often apply both algorithms and choose the model that gives the best result. But have you ever thought about why a particular model performs best on your data?

Let's look at some aspects of data.

Is Your Data Linearly Separable?

Logistic Regression assumes that the data is linearly (or curvilinearly) separable in space.

Figure: linearly separable data


Decision Trees are non-linear classifiers; they do not require data to be linearly separable.

Figure: non-linearly separable data

When you are sure that your data set divides into two separable parts, use Logistic Regression. If you're not sure, go with a Decision Tree; it will handle both cases.
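The contrast above can be sketched on the classic XOR data set, which no straight line can separate. This is an illustrative sketch using scikit-learn (the library choice is mine, not the article's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# XOR: the textbook example of non-linearly-separable data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

lr = LogisticRegression().fit(X, y)
dt = DecisionTreeClassifier(random_state=0).fit(X, y)

# Any linear boundary misclassifies at least one XOR point,
# so Logistic Regression cannot reach full accuracy here.
print("LR accuracy:", lr.score(X, y))
# Axis-aligned splits let the tree carve out all four corners.
print("DT accuracy:", dt.score(X, y))
```

The tree needs only two levels of splits (one per feature) to separate XOR, which is exactly the non-linearity Logistic Regression lacks.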

Check Data Types

Categorical data works well with Decision Trees, while continuous data works well with Logistic Regression.

If your data is categorical, note that Logistic Regression cannot handle it in pure string form; you first need to convert it into numerical data. There are two common options:

  1. Enumeration: If we enumerate the labels, e.g. Mumbai = 1, Delhi = 2, Bangalore = 3, Chennai = 4, then the algorithm will treat Delhi (2) as twice as large as Mumbai (1), even though no such ordering exists between cities.

  2. One-Hot Encoding: To avoid the problem above, use One-Hot Encoding; however, this adds one column per distinct label and can blow up the dimensionality. Therefore, if you have lots of categorical data, go with a Decision Tree.
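The two encodings above can be sketched with pandas (an illustrative choice; the article does not prescribe a library):

```python
import pandas as pd

cities = pd.DataFrame({"city": ["Mumbai", "Delhi", "Bangalore", "Chennai"]})

# Enumeration: each city becomes one integer code, which imposes a
# spurious order and magnitude on labels that have neither.
codes = cities["city"].astype("category").cat.codes
print(codes.tolist())

# One-hot encoding: one indicator column per label, no false ordering,
# but the feature count grows with the number of distinct labels.
one_hot = pd.get_dummies(cities["city"])
print(one_hot.shape)
```

With four cities the one-hot matrix is 4x4; with thousands of distinct labels, the same transformation produces thousands of columns, which is the dimensionality problem the article warns about.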

Is Your Data Highly Skewed?

Decision Trees handle skewed classes nicely if we let them grow fully.

E.g., 99% of the data is +ve and 1% is -ve.

Figure: highly skewed data in a Decision Tree

So, if you find class imbalance in a dataset, let the Decision Tree grow fully rather than pruning branches away; if you must limit growth, choose the maximum depth with the skew in mind, so minority-class regions still get their own leaves.

Logistic Regression does not handle skewed classes well. In this case, either increase the weight of the minority class or balance the classes (e.g., by resampling).
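One way to increase the weight of the minority class is scikit-learn's `class_weight="balanced"` option, which reweights each class inversely to its frequency. A minimal sketch on synthetic 99%/1% data (the data and library choice are mine):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Imbalanced data: 990 +ve points around (1, 1), 10 -ve points around (-1, -1).
X = np.vstack([
    rng.normal(loc=1.0, size=(990, 2)),
    rng.normal(loc=-1.0, size=(10, 2)),
])
y = np.array([1] * 990 + [0] * 10)

# 'balanced' multiplies each sample's loss by n_samples / (n_classes * count),
# so the 10 minority points count as much as the 990 majority points.
lr = LogisticRegression(class_weight="balanced").fit(X, y)
print("accuracy:", lr.score(X, y))
```

Without the reweighting, predicting "+ve" everywhere already scores 99% accuracy, which is why raw accuracy is a poor guide on skewed data; the balanced model instead places the boundary between the two clusters.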

Does Your Data Contain Outliers?

Logistic Regression will push the decision boundary towards an outlier.

Figure: the decision boundary moving toward outliers

A Decision Tree, by contrast, won't be affected by an outlier at the initial stage: an impure leaf may contain nine +ve points and one -ve outlier, and the leaf will still be labeled +ve, since the majority is positive.

However, if we let the Decision Tree grow fully, the signal will move to one side while the outliers move to the other; eventually there is one leaf per outlier, i.e., the tree overfits the outliers.
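The nine-plus-one leaf example above can be sketched directly (scikit-learn, with made-up 1-D data):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Nine +ve points plus a single -ve outlier sitting among them.
X = np.array([[i] for i in range(9)] + [[4.5]], dtype=float)
y = np.array([1] * 9 + [0])

# A depth-limited tree leaves the outlier in an impure leaf,
# which takes the majority (+ve) label.
shallow = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X, y)
print("shallow:", shallow.predict([[4.5]]))

# A fully grown tree keeps splitting until leaves are pure,
# carving out a tiny leaf just for the outlier.
full = DecisionTreeClassifier(random_state=0).fit(X, y)
print("full:", full.predict([[4.5]]))
```

The shallow tree predicts +ve for the outlier (robust, majority vote); the full tree memorizes it, which is exactly the overfitting trade-off described above.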

Does Your Data Contain Many Missing Values?

Logistic Regression does not handle missing values; we need to impute them, e.g., with the mean, mode, or median.

If there are many missing values, imputing them may not be a good idea, since imputing the mean everywhere changes the distribution of the data.

Decision Trees work with missing values.
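A typical imputation pipeline for Logistic Regression looks like the sketch below (scikit-learn, tiny made-up data). One caveat worth hedging: whether a Decision Tree accepts missing values natively depends on the implementation — some (e.g., R's rpart via surrogate splits, and recent scikit-learn tree versions) do, while older scikit-learn versions also required imputation.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A small feature matrix with NaNs scattered through it.
X = np.array([[1.0, np.nan],
              [2.0, 0.5],
              [np.nan, 1.5],
              [4.0, 2.0]])
y = np.array([0, 0, 1, 1])

# Logistic Regression rejects NaNs, so impute (column mean here) first.
model = make_pipeline(SimpleImputer(strategy="mean"), LogisticRegression())
model.fit(X, y)
print(model.predict(X))
```

Putting the imputer inside the pipeline matters: the column means are learned on the training data only, so they are reused consistently at prediction time instead of being recomputed on new data.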

Cheatsheet

Data characteristic        Preferred approach
Linearly non-separable     Decision Tree
Categorical features       Decision Tree
Continuous features        Logistic Regression
Skewed classes             Decision Tree, or give high weight to the minority class in Logistic Regression
Outliers                   Decision Tree, or remove outliers for Logistic Regression
Many missing values        Decision Tree