machine learning - better results from simple linear regression than multivariate/multiple regression


I have an existing model that predicts house prices using simple linear regression. The input is the date and the output is the price.

Wanting to improve the overall results, I have added one more feature. The new feature is a distance measure for the estimated property.

The problem is that the multiple/multivariate regression performs a bit worse than the simple regression (all data is normalised).

Do you have any ideas why this is happening and how I can approach it?
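For context, a minimal sketch of how such a comparison might be set up. The synthetic data, column names, scikit-learn estimators and held-out-MSE evaluation below are assumptions for illustration, not the original code:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import mean_squared_error

    # Synthetic stand-in for the real data set (hypothetical columns).
    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "date_num": np.arange(n),            # date encoded as a number
        "distance": rng.normal(size=n),      # the newly added feature
    })
    df["price"] = 100 + 0.5 * df["date_num"] + rng.normal(scale=5, size=n)

    def heldout_mse(X, y):
        """Fit a linear regression on normalised features, report held-out MSE."""
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=0
        )
        scaler = StandardScaler().fit(X_train)   # normalise with training stats only
        model = LinearRegression().fit(scaler.transform(X_train), y_train)
        return mean_squared_error(y_test, model.predict(scaler.transform(X_test)))

    y = df["price"].to_numpy()
    print("simple (date only):       ", heldout_mse(df[["date_num"]].to_numpy(), y))
    print("multiple (date + distance):", heldout_mse(df[["date_num", "distance"]].to_numpy(), y))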

There are dozens of possible reasons; to list a few:

  • If the new feature is barely correlated with what you are trying to predict, you are effectively injecting noise into the system and cannot expect better performance.
  • If you have few data points, adding more features can make the problem harder to learn.
  • Since you are using a linear model, the new feature may be a real predictor, but if its relation to the dependent variable is not linear the model will fail to exploit it well.
  • Linear regression is such a naive model that ridge/lasso regression might change the result (especially lasso, since it deals better with bad features); see the sketch after this list.
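As a rough illustration of the first and last points, here is a sketch (reusing the hypothetical df, y and StandardScaler from the snippet in the question): it checks how strongly the new feature correlates with the target, and fits a lasso model, whose near-zero coefficients flag features that add little:

    from sklearn.linear_model import Lasso
    from sklearn.pipeline import make_pipeline

    # 1. Correlation of the new feature with the target.
    print("corr(distance, price):", df["distance"].corr(df["price"]))

    # 2. Lasso on both features; a coefficient shrunk to (near) zero
    #    suggests the feature contributes mostly noise.
    X = df[["date_num", "distance"]].to_numpy()
    pipe = make_pipeline(StandardScaler(), Lasso(alpha=0.1)).fit(X, y)
    print("lasso coefficients [date, distance]:", pipe.named_steps["lasso"].coef_)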
