Machine Learning

Complete the form to build your model.
PID: 113633

Q1. On a scale from 1 to 10, how would you rate 'Saghav Gourisankar' (സഖാവ് ഗൗരിശങ്കർ)? (Integer - 127862)
Q2. Would you recommend 'Saghav Gourisankar' (സഖാവ് ഗൗരിശങ്കർ)? (Single choice - 127863)



Dataset: 113633
Samples: 0
Features: 5

A sample is created each time a user votes on a poll. We store each sample for one month and then delete it, so every sample you load here is less than a month old. If you need this data for longer than a month, export the dataset to save the samples on your device.


Regression algorithm used to build the model.
Classification algorithm used to build the model.
Clustering algorithm used to build the model.
Dimensionality reduction algorithm used to build the model.
Maximum degree of the polynomial.
Number of trees in the forest. Enter an integer between 1 and 1,000.
Number of features to consider when looking for the best split. Enter an integer between 1 and the number of X variables selected.
Minimum number of samples required to be at a leaf node. Enter an integer between 1 and the number of train samples.
Maximum depth of each tree. Enter an integer between 1 and 20.
Method used to calculate a prediction from the output of each tree.
Probability distribution used in the model.
Number of nearest neighbors used to make a prediction. Enter an integer between 1 and the number of train samples.
Number of clusters to form.
Number of iterations used to train the model. Enter an integer between 1 and 100,000.
Activation function used in each hidden layer.
Layer 1 nodes
Number of hidden layers in the model.
Method used to select the initial position of each cluster centroid.
Metric used to compute distance.
Creates the random number generator used to build the model. Enter an integer between 1 and 1,000,000,000 for a deterministic build. Leave the input blank for a random build.
0.01 - Rate at which the model learns the weight and bias parameters.
0.0001 - Regularization prevents overfitting the model to the data.
0.0001 - Error tolerance for the model.
Whether to calculate the intercept for the model.
Whether bootstrap aggregating is used to build the trees.
Centering subtracts the mean from each data point.
Scaling divides each data point by the standard deviation.
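
For reference, the sketch below shows how the tree, bagging, seed, centering, and scaling settings described above could map onto a scikit-learn-style pipeline. The library, class names, and parameter values are assumptions for illustration, not necessarily what this tool runs internally.

    # Minimal sketch, assuming scikit-learn; the tool's internals may differ.
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    model = make_pipeline(
        # Center data / Scale data: subtract the mean, divide by the standard deviation.
        StandardScaler(with_mean=True, with_std=True),
        RandomForestRegressor(
            n_estimators=100,     # number of trees in the forest (1 to 1,000)
            max_features=3,       # features considered when looking for the best split
            min_samples_leaf=1,   # minimum samples required at a leaf node
            max_depth=10,         # maximum depth of each tree (1 to 20)
            bootstrap=True,       # bagging: bootstrap aggregating over the trees
            random_state=42,      # seed: fixed integer for a deterministic build, None for random
        ),
    )
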
Variable
Build
Preview
Chart
X
Y


Train
Test
Split the samples into train and test subsets. Train samples build the model. Test samples evaluate the performance of the model.
Shuffling randomly changes the order of the samples.
Creates the random number generator used to shuffle the samples. Enter an integer between 1 and 1,000,000,000 for a deterministic shuffle. Leave the input blank for a random shuffle.
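
A minimal sketch of this split, assuming a scikit-learn-style train_test_split helper; the data, the 80/20 ratio, and the seed are placeholders.

    # Minimal sketch: split samples into train and test subsets (assumes scikit-learn).
    import numpy as np
    from sklearn.model_selection import train_test_split

    X = np.random.rand(100, 5)   # placeholder feature matrix (100 samples, 5 features)
    y = np.random.rand(100)      # placeholder target

    X_train, X_test, y_train, y_test = train_test_split(
        X, y,
        test_size=0.2,      # e.g. 80% train / 20% test
        shuffle=True,       # randomly reorder the samples before splitting
        random_state=7,     # shuffle seed: fixed integer for a deterministic shuffle, None for random
    )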

Load data to build a model.

Type:
Algorithm:
Date:
Time:
Dataset:
Header:
Rows:
Columns:
X variables:
Y variable:
Classes:
Degree:
Estimators:
Max features:
Min leaf samples:
Max tree depth:
Selection method:
Distribution:
Neighbors:
Clusters:
Iterations:
Activation:
Hidden layers:
Initialization:
Distance:
Seed:
Learning rate:
Regularization:
Tolerance:
Intercept:
Bagging:
Center data:
Scale data:
Shuffle samples:
Shuffle seed:
Power:
Y = A X^B
A:
B:

Linear:
log(Y) = B log(X) + log(A)
Exponential:
Y = A e^(B X)
A:
B:

Linear:
log(Y) = B X + log(A)
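
Both curve fits above are solved by taking logarithms and fitting a straight line. A minimal sketch of that idea, assuming NumPy's polyfit; the sample data are placeholders.

    # Minimal sketch: fit Y = A X^B and Y = A e^(B X) by linearizing with logs.
    import numpy as np

    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # placeholder data
    Y = np.array([2.1, 8.3, 18.9, 33.5, 52.0])

    # Power fit: log(Y) = B log(X) + log(A) is a straight line in log(X).
    B_pow, logA_pow = np.polyfit(np.log(X), np.log(Y), 1)
    A_pow = np.exp(logA_pow)

    # Exponential fit: log(Y) = B X + log(A) is a straight line in X.
    B_exp, logA_exp = np.polyfit(X, np.log(Y), 1)
    A_exp = np.exp(logA_exp)

    print(f"Power:       Y = {A_pow:.3f} X^{B_pow:.3f}")
    print(f"Exponential: Y = {A_exp:.3f} e^({B_exp:.3f} X)")
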
coef    std err    t    P>|t|
coef
DF residuals:
DF model:
F-statistic:
Prob (F-statistic):
R-squared:
Adjusted R-squared:
Residual std err:
Residuals:
Min    1Q    Median    3Q    Max
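
These are the standard ordinary-least-squares summary statistics. A minimal sketch assuming statsmodels, which reports the same quantities; the data are placeholders.

    # Minimal sketch: OLS summary with the statistics listed above (assumes statsmodels).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2))                             # placeholder X variables
    y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=50)

    results = sm.OLS(y, sm.add_constant(X)).fit()            # add_constant fits the intercept
    print(results.summary())                                 # coef, std err, t, P>|t|, F-statistic, R-squared
    print(np.percentile(results.resid, [0, 25, 50, 75, 100]))  # residual Min, 1Q, Median, 3Q, Max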

Tree #

Structure:
Tree height:
Internal nodes:
Leaf nodes:
Gain function:
Split function:
Parent node
Current node
Height:
Depth:
Left: <=
Right: >
Left child node
Right child node
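
The node fields above describe a standard binary decision tree. A minimal sketch of inspecting that structure, assuming scikit-learn's tree representation; the data and depth limit are placeholders.

    # Minimal sketch: inspect a fitted decision tree's structure (assumes scikit-learn).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                # placeholder features
    y = (X[:, 0] + X[:, 1] > 0).astype(int)      # placeholder labels

    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    t = clf.tree_
    n_leaves = int(np.sum(t.children_left == -1))    # leaf nodes have no children
    n_internal = t.node_count - n_leaves
    print("Tree height:", clf.get_depth())
    print("Internal nodes:", n_internal, "Leaf nodes:", n_leaves)
    # Split rule at each internal node: go left if feature <= threshold, otherwise go right.
    print("Root split: X[%d] <= %.3f" % (t.feature[0], t.threshold[0]))
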
Prior probability
Bias
Cluster size
Cluster error
Iterations completed:
Converged:
Eigenvalues
Explained variance
Cumulative variance
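
The clustering and dimensionality-reduction summaries above (cluster sizes, iterations, convergence, eigenvalues, explained variance) have direct counterparts in common libraries. A minimal sketch assuming scikit-learn's KMeans and PCA; the data are placeholders.

    # Minimal sketch: cluster and variance summaries like those above (assumes scikit-learn).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                      # placeholder samples

    km = KMeans(n_clusters=3, init="k-means++", n_init=10, max_iter=300, random_state=0).fit(X)
    print("Cluster size:", np.bincount(km.labels_))
    print("Cluster error (inertia):", km.inertia_)     # sum of squared distances to centroids
    print("Iterations completed:", km.n_iter_)         # converged if this is below max_iter

    pca = PCA().fit(X)
    print("Eigenvalues:", pca.explained_variance_)
    print("Explained variance:", pca.explained_variance_ratio_)
    print("Cumulative variance:", np.cumsum(pca.explained_variance_ratio_))
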
Train samples:

OOB
Max error
MAE
MedAE
MAPE
MSE
RMSE
MSLE
Predicted
OOB
Accuracy
F1 Score
TPR
TNR
PPV
NPV
Out-of-bag samples:
Model
Baseline
Max error
MAE
MedAE
MAPE
MSE
RMSE
MSLE
Predicted
Model
Baseline
Accuracy
F1 Score
TPR
TNR
PPV
NPV
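
A minimal sketch of how the regression and classification scores listed above can be computed, along with a most-frequent-class baseline, assuming scikit-learn's metrics module; the predictions are placeholders and the tool's own formulas may differ in detail.

    # Minimal sketch: model-vs-baseline scores like those listed above (assumes scikit-learn).
    import numpy as np
    from sklearn import metrics
    from sklearn.dummy import DummyClassifier

    y_true = np.array([0, 1, 1, 0, 1, 1, 0, 1])        # placeholder labels
    y_pred = np.array([0, 1, 0, 0, 1, 1, 1, 1])        # placeholder model predictions

    # Classification scores
    tn, fp, fn, tp = metrics.confusion_matrix(y_true, y_pred).ravel()
    print("Accuracy:", metrics.accuracy_score(y_true, y_pred))
    print("F1 Score:", metrics.f1_score(y_true, y_pred))
    print("TPR:", tp / (tp + fn), "TNR:", tn / (tn + fp))   # sensitivity, specificity
    print("PPV:", tp / (tp + fp), "NPV:", tn / (tn + fn))   # precision, negative predictive value

    # Baseline: a dummy model that always predicts the most frequent class
    X_dummy = np.zeros((len(y_true), 1))
    baseline = DummyClassifier(strategy="most_frequent").fit(X_dummy, y_true)
    print("Baseline accuracy:", metrics.accuracy_score(y_true, baseline.predict(X_dummy)))

    # Regression scores (on continuous placeholder targets)
    r_true = np.array([1.0, 2.0, 3.0, 4.0])
    r_pred = np.array([1.1, 1.8, 3.3, 3.9])
    print("Max error:", metrics.max_error(r_true, r_pred))
    print("MAE:", metrics.mean_absolute_error(r_true, r_pred))
    print("MedAE:", metrics.median_absolute_error(r_true, r_pred))
    print("MAPE:", metrics.mean_absolute_percentage_error(r_true, r_pred))
    mse = metrics.mean_squared_error(r_true, r_pred)
    print("MSE:", mse, "RMSE:", float(np.sqrt(mse)))
    print("MSLE:", metrics.mean_squared_log_error(r_true, r_pred))
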
Test samples:

Input:

Output:

Model
Baseline

Build a model to view results.