Matches in Nanopublications for { <https://doi.org/10.3233/DS-240059> ?p ?o ?g. }
Showing items 1 to 43 of 43, with 100 items per page.
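A triple-pattern lookup like the one above can be reproduced programmatically; the matches themselves are listed after this sketch. The snippet below uses Python with SPARQLWrapper. The endpoint URL is a placeholder, not one named by this page, and wrapping the pattern in `GRAPH ?g` is an assumption about how the service exposes the nanopublication graphs.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint: substitute the nanopub query service you actually use.
sparql = SPARQLWrapper("https://example.org/nanopub/sparql")
sparql.setQuery("""
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://doi.org/10.3233/DS-240059> ?p ?o . }
    }
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# One row per matching quad; ?g identifies the graph (e.g. an assertion
# or provenance graph of some nanopublication) the triple came from.
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```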
- DS-240059 type ScholarlyWork provenance.
- DS-240059 type ResourcePaper assertion. (6 matches)
- DS-240059 title "Measuring Data Drift with the Unstable Population Indicator" assertion. (6 matches)
- DS-240059 date "2024" assertion. (5 matches)
- DS-240059 date "2024-06-26" assertion.
- DS-240059 authoredBy 0000-0003-2581-8370 assertion. (3 matches)
- DS-240059 authoredBy 0009-0003-5030-0108 assertion. (3 matches)
- DS-240059 isPartOf 2451-8492 assertion. (6 matches)
- DS-240059 abstract "Measuring data drift is essential in machine learning applications where model scoring (evaluation) is done on data samples that differ from those used in training. The Kullback-Leibler divergence is a common measure of shifted probability distributions, for which discretized versions are invented to deal with binned or categorical data. We present the Unstable Population Indicator, a robust, flexible and numerically stable, discretized implementation of Jeffrey's divergence, along with an implementation in a Python package that can deal with continuous, discrete, ordinal and nominal data in a variety of popular data types. We show the numerical and statistical properties in controlled experiments. It is not advised to employ a common cut-off to distinguish stable from unstable populations, but rather to let that cut-off depend on the use case." assertion. (6 matches; see the divergence sketch after the list)
- DS-240059 hasPart RA0XRooQKz2A7aoP0VJLS2NKcvQv-n7RwPoYtcD4wtTPc assertion. (3 matches)
- DS-240059 hasPart RA4SqymT32eltSYbr41lDKMBV3Zr8nEBEXRFhfOrN6f3k assertion. (3 matches)
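The abstract above describes the Unstable Population Indicator as a robust, numerically stable, discretized implementation of Jeffrey's divergence (the symmetrized Kullback-Leibler divergence). As a rough illustration of that idea (this is not the paper's package or its API; the function name, smoothing constant, and example counts are all invented for the sketch), the divergence between two binned samples can be computed like this:

```python
import numpy as np

def jeffreys_divergence(expected_counts, observed_counts, eps=1e-6):
    """Discretized Jeffrey's divergence between two binned samples.

    A sketch of the idea behind the Unstable Population Indicator,
    not the paper's actual package. `eps` pads empty bins so the
    logarithm stays finite (one simple route to numerical stability).
    """
    p = np.asarray(expected_counts, dtype=float) + eps
    q = np.asarray(observed_counts, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    # Symmetrized KL divergence: each term (p - q) * ln(p / q) is >= 0,
    # so the sum is 0 only when the two distributions coincide.
    return float(np.sum((p - q) * np.log(p / q)))

# Invented example: category frequencies at training vs. scoring time.
train_counts = [400, 350, 250]
score_counts = [300, 300, 400]
print(jeffreys_divergence(train_counts, score_counts))
```

Consistent with the abstract's closing advice, the sketch returns a raw divergence and bakes in no cut-off: whether a given value signals an unstable population should be judged per use case.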
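The hasPart objects are nanopublication artifact codes (trusty URIs). Assuming the conventional http://purl.org/np/ resolver prefix and a server that honors content negotiation (neither is stated by this listing), one of the parts could be fetched as TriG roughly like this:

```python
import requests

# Artifact code taken from the hasPart rows above.
artifact = "RA0XRooQKz2A7aoP0VJLS2NKcvQv-n7RwPoYtcD4wtTPc"

# Assumed resolver prefix; nanopubs are quad-based, so ask for TriG
# to keep the assertion, provenance, and pubinfo graphs separate.
resp = requests.get(
    f"http://purl.org/np/{artifact}",
    headers={"Accept": "application/trig"},
    timeout=30,
)
resp.raise_for_status()
print(resp.text)
```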