
Dion Research Fork

NOTE:

If you have found this repository and need support for a modern version of scikit-learn, it can be installed with:

pip install git+https://github.com/dionresearch/few.git@recent_scikit_learn_support

This fork will live here for now, as the original repository is not accepting pull requests.

Few

Few is a Feature Engineering Wrapper for scikit-learn. Few looks for a set of feature transformations that work best with a specified machine learning algorithm in order to improve model estimation and prediction. In doing so, Few is able to provide the user with a set of concise, engineered features that describe their data.

Few uses genetic programming to generate, search and update engineered features. It incorporates feedback from the ML process to select important features, while also scoring them internally.

Install

You can use pip to install FEW from PyPI with:

pip install few

or you can clone the git repo and add it to your Python path. Then from the repo, run

python setup.py install

Mac users

Some Mac users have reported issues when installing with old versions of gcc (like gcc-4.2) because the random.h library is not included (basically this issue). I recommend installing gcc-4.8 or greater for use with Few. After updating the compiler, you can reinstall with

CC=gcc-4.8 python setup.py install

Usage

Few uses the same nomenclature as sklearn supervised learning modules. Here is a simple example script:

# import few
from few import FEW
# the underlying ML estimator used in this example
from sklearn.linear_model import LassoLarsCV
# initialize
learner = FEW(generations=100, population_size=25, ml=LassoLarsCV())
# fit model
learner.fit(X, y)
# generate prediction
y_pred = learner.predict(X_unseen)
# get feature transformation
Phi = learner.transform(X_unseen)
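
The matrix Phi holds the engineered features as columns, so it can also feed any other scikit-learn estimator. Continuing the snippet above, a minimal sketch (the choice of Ridge here is purely illustrative):

# fit a separate sklearn model on the engineered features (illustrative choice)
from sklearn.linear_model import Ridge
ridge = Ridge().fit(learner.transform(X), y)
ridge_pred = ridge.predict(learner.transform(X_unseen))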

You can also call Few from the terminal as

python -m few.few data_file_name 

Try python -m few.few --help to see the available options.

Examples

Check out few_example.py to see how to apply FEW to a regression dataset.
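
As a minimal end-to-end sketch in the same spirit (the dataset, split, and parameter values below are illustrative and not taken from few_example.py):

# illustrative regression run; dataset and settings are arbitrary choices
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LassoLarsCV
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from few import FEW

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

learner = FEW(generations=10, population_size=10, ml=LassoLarsCV())
learner.fit(X_train, y_train)
y_pred = learner.predict(X_test)
print("test MSE:", mean_squared_error(y_test, y_pred))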

Publications

If you use Few, please reference our publications:

La Cava, W., and Moore, J.H. A general feature engineering wrapper for machine learning using epsilon-lexicase survival. Proceedings of the 20th European Conference on Genetic Programming (EuroGP 2017), Amsterdam, Netherlands. preprint

La Cava, W., and Moore, J.H. Ensemble representation learning: an analysis of fitness and survival for wrapper-based genetic programming methods. GECCO '17: Proceedings of the 2017 Genetic and Evolutionary Computation Conference. Berlin, Germany. arxiv

Acknowledgments

This method is being developed to study the genetic causes of human disease in the Epistasis Lab at UPenn. Work is partially supported by the Warren Center for Network and Data Science. Thanks to Randy Olson and TPOT for Python guidance.
