
gpu

Here are 1,931 public repositories matching this topic...

H2O is an Open Source, Distributed, Fast & Scalable Machine Learning Platform: Deep Learning, Gradient Boosting (GBM) & XGBoost, Random Forest, Generalized Linear Modeling (GLM with Elastic Net), K-Means, PCA, Generalized Additive Models (GAM), RuleFit, Support Vector Machine (SVM), Stacked Ensembles, Automatic Machine Learning (AutoML), etc.

  • Updated Dec 5, 2020
  • Jupyter Notebook
rsn870 commented Aug 21, 2020

Hi,

I have tried out both loss.backward() and model_engine.backward(loss) for my code. There are several subtle differences that I have observed; for one, retain_graph=True does not work with model_engine.backward(loss). This is creating a problem, since buffers are not being retained each time I run the code.

Please look into this if you could.
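
A minimal PyTorch-only sketch of the behaviour in question: plain loss.backward() accepts retain_graph=True, which keeps the autograd graph and its intermediate buffers alive so the same graph can be backpropagated through again. The tensors below and the DeepSpeed call mentioned in the comments are illustrative assumptions, not taken from the reporter's code.

```python
import torch

# Toy graph: two leaf tensors and a scalar loss.
x = torch.randn(4, 3, requires_grad=True)
w = torch.randn(3, 2, requires_grad=True)
out = (x @ w).sum()

out.backward(retain_graph=True)  # first pass keeps the graph alive
out.backward()                   # second pass over the same graph now succeeds

# With a DeepSpeed engine the call becomes model_engine.backward(loss);
# per the report above, that wrapper did not accept retain_graph at the time,
# so intermediate buffers are freed after the first backward pass.
```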

dominicshanshan commented Nov 19, 2020

Is your feature request related to a problem? Please describe.
For time series analysis, we need support for Spearman correlation matrix calculation in cuDF.

Describe the solution you'd like
Something similar to pandas.DataFrame().corr(method='spearman').

Additional context
Is it possible to share the roadmap for adding this feature? Big thanks!
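
Until native support lands, one possible workaround is to use the definition of Spearman correlation directly: it is the Pearson correlation of the ranks. The sketch below assumes your cuDF version provides DataFrame.rank() and a Pearson DataFrame.corr(); the column names and values are made up for illustration.

```python
import cudf

# Illustrative data; the column names and values are assumptions.
df = cudf.DataFrame({
    "a": [1.0, 2.0, 3.0, 4.0, 5.0],
    "b": [2.0, 1.0, 4.0, 3.0, 5.0],
})

# Rank each column, then take the (Pearson) correlation of the ranks.
# This reproduces pandas.DataFrame.corr(method='spearman') on the GPU.
spearman = df.rank().corr()
print(spearman)
```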

jankrynauw commented Jun 6, 2019

We would like to forward a particular 'key' column, which is part of the features, so that it appears alongside the predictions; this makes it possible to identify which set of features a particular prediction belongs to. Here is an example of the predictions output using the tensorflow.contrib.estimator.multi_class_head:

{"classes": ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"],
 "scores": [0.068196
