AIMD GPDS Courses

Lesson 14


◆  Intro to Feature Selection


◆  Statistics-based Methods 


◆  Dimensionality Reduction-based


◈  Information Theory-based


◆  Wrapper-type Methods


◆  Genetic Algorithm-based


⟐  Implementation
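
As a small companion to the Implementation section above, the following is a minimal sketch of information theory-based selection using scikit-learn's mutual_info_classif scorer; the breast-cancer dataset and k=5 are illustrative assumptions, not part of the lesson material.

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Load an example dataset (illustrative choice only).
data = load_breast_cancer()
X, y = data.data, data.target

# Score each feature by its estimated mutual information with the target
# and keep the k highest-scoring features.
selector = SelectKBest(score_func=mutual_info_classif, k=5)
X_selected = selector.fit_transform(X, y)

print("Selected features:", list(data.feature_names[selector.get_support()]))
print("Reduced shape:", X_selected.shape)

mutual_info_classif estimates how much information each feature shares with the class label, so features that tell us little about the target are the ones dropped.
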

See also the following links:

▎ENGLISH

Information Gain and Mutual Information for Machine Learning
▸ https://machinelearningmastery.com/information-gain-and-mutual-information/

Information gain ratio - Wikipedia
▸ https://en.wikipedia.org/wiki/Information_gain_ratio

Chi-Square Test for Feature Selection in Machine Learning
▸ https://towardsdatascience.com/chi-square-test-for-feature-selection-in-machine-learning-206b1f0b8223


▎JAPANESE

Calculating Information Gain and Gain Ratio (SAS documentation)
▸ https://support.sas.com/documentation/cdl_alternate/ja/vaug/68027/HTML/default/p1ae9n099ugyl0n0zb6ovq28cg2h.htm

Feature Selection Using Chi-Square Values - 人工知能に関する断創録
▸ https://aidiary.hatenablog.com/entry/20100625/1277470153

Feature Selection for Categorical Variables Using the Chi-Square Test
▸ https://qiita.com/strg-frontier/items/a48459c4082ac3df7829
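
The chi-square test covered in several of the links above can be sketched in the same way, swapping in the chi2 scorer; the iris dataset, min-max scaling, and k=2 below are illustrative assumptions only.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.preprocessing import MinMaxScaler

data = load_iris()
# chi2 requires non-negative values, so scale features to [0, 1] first.
X = MinMaxScaler().fit_transform(data.data)
y = data.target

# Keep the k features with the largest chi-square statistics.
selector = SelectKBest(score_func=chi2, k=2)
X_selected = selector.fit_transform(X, y)

print("Chi-square scores:", dict(zip(data.feature_names, np.round(selector.scores_, 2))))
print("Kept features:", [name for name, keep in zip(data.feature_names, selector.get_support()) if keep])
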


©2023 Samy Baladram, Graduate Program in Data Science - GSIS - Tohoku University. All rights reserved.