CatBoostML
Official account for CatBoost, an open-source gradient boosting library with categorical features support
186 Tweets · 32 Following · 622 Followers
Tweets
CatBoostML Jun 8
ICML starts soon! We have a talk tomorrow, June 8, at 12 PM at the expo. Come for an overview of the algorithm and ask your questions!
CatBoostML retweeted
PyConRu Jun 7
Stanislav Kirillov will share recipes for efficient use in Python code that let you train models quickly while avoiding memory overhead and unnecessary data conversions. Full talk abstract →
CatBoostML Jun 7
vs XGBoost on Two Sigma competition
CatBoostML Jun 6
There are a few ways to build a tree in . The default is SymmetricTree, which is the best type for dense, noisy datasets. For datasets with little noise, and for one-hot encoded features, Depthwise and Lossguide trees can work better. Explore to improve your quality!
CatBoostML Jun 6
Replying to @Kostoglodov
Do you need it? What is the scenario in which it would be used?
CatBoostML Jun 5
can be applied in C++, Python, R, .NET, Java and Rust, and exported to CoreML, PMML, ONNX and JSON. Did we miss anything?
CatBoostML Jun 3
lets you compare several models visually in a notebook by looking at their train and validation curves.
CatBoostML May 30
It is possible to use a model on your iPhone. See the example in our tutorial
CatBoostML May 30
Replying to @amjad_aburmileh
Yes, here it is:
CatBoostML May 29
0.15 contains a new way to understand and debug your model: feature plots that show the average label and prediction for each feature value, and how the prediction changes when a feature value is changed.
CatBoostML May 28
Replying to @billehsarkozy
We are working on it right now. It will be out in the next release. It will not have full categorical features support though, since we need hash tables for them.
CatBoostML May 28
Leaf index prediction is used in and many other places. See details of how to use it in our new 0.15 tutorial
CatBoostML May 27
New release is out! It contains leaf index prediction, C# and Rust appliers, Expectile regression, Huber loss, and many more changes; see
CatBoostML May 22
This month is full of contributions from external contributors to : new model analysis, Huber loss and Expectile regression, tree visualization, Rust, C#, CoreML one-hot, SHAP speedup! Thank you very much!
CatBoostML May 22
Welcome the most-awaited section of our docs, the FAQ, which answers many nontrivial questions.
CatBoostML May 17
The new release is coming soon! Let us know if there is something we can add for you!
CatBoostML May 16
If you like and want to discuss and there, we are now present in , , and .
CatBoostML May 16
We have active discussions on our channels and . Join us, ask questions, write answers about !
CatBoostML May 15
docs have recently moved to . It's easier to search and navigate now!
CatBoostML May 14
Replying to @BillMutoro
Missing feature values are allowed in your data. Use a NaN value, or, if you are reading from a file, use "", N/A, or any other NaN representation.