# Graded Advanced Features II Quiz

### Week 4 – Graded Advanced Features II Quiz >>> How to Win a Data Science Competition: Learn from Top Kagglers

1. Imagine that we apply `X = PCA(n_components=5).fit_transform(data)` and `data` has shape (5000, 53). What is the shape of `X`?

1 point

(5000, 5)
(5, 5000)
(53, 5)

2. To which data is NMF NOT applicable?

1 point

Bag-of-words matrix
Standardized matrix
One-Hot encoded feature

3. Suppose we have 2 categorical features: **f1** with **A** possible values and **f2** with **B** possible values. How many values will their interaction have?

1 point

Exactly A + B
Exactly A * B
Less or equal to A * B
max(A, B)

4. Imagine we have 2 categorical features represented as integers: **f1** with all values in range [0, 1000] and **f2** with values in range [0, 100]. What is the correct way to build their interaction?

1 point

f1 + f2
f1.astype(str) + f2.astype(str)
f1.astype(str) + "_" + f2.astype(str)
(f1 + f2).astype(str)

5. What is the correct way to get a t-SNE projection of train and test data?

1 point

Apply t-SNE to the train data and after that to the test data.
Apply t-SNE to the test data first and then to the train data.
Apply t-SNE to the concatenation of train and test and split the projection back.
Doesn't matter; all variants will produce the same result.

6. Is it possible to do a t-SNE projection into 20-dimensional space?

1 point

Yes, why not.
No, only 2-dim or 3-dim projections are possible.

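The PCA shape in question 1 can be checked directly: `fit_transform` returns one row per sample and one column per component, so a (5000, 53) input with `n_components=5` yields a (5000, 5) projection. A minimal sketch (the random data stands in for the real dataset):

```python
import numpy as np
from sklearn.decomposition import PCA

data = np.random.rand(5000, 53)  # same shape as in the question
X = PCA(n_components=5).fit_transform(data)
print(X.shape)  # (5000, 5): n_samples rows, n_components columns
```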
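Question 2 hinges on the fact that NMF (non-negative matrix factorization) requires non-negative input. A bag-of-words count matrix or one-hot encoding satisfies this, while a standardized matrix contains negative values; scikit-learn's `NMF` rejects it. A quick sketch on toy count data:

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.preprocessing import StandardScaler

# Non-negative, bag-of-words-like count matrix: NMF is applicable
counts = np.random.RandomState(0).randint(0, 5, size=(20, 10)).astype(float)
W = NMF(n_components=3, init='random', random_state=0, max_iter=500).fit_transform(counts)
print(W.shape)  # (20, 3)

# Standardization centers each column at 0, so negative values appear
standardized = StandardScaler().fit_transform(counts)
nmf_error = ""
try:
    NMF(n_components=3).fit_transform(standardized)
except ValueError as err:
    nmf_error = str(err)  # NMF refuses negative input
print(nmf_error)
```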

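For questions 3 and 4: an interaction of two categoricals has at most A * B distinct values (fewer if some combinations never occur), and the safe way to build it is string concatenation with a separator. Plain integer addition collapses distinct pairs (1 + 2 == 2 + 1), and concatenation without a separator is ambiguous ("1" + "23" == "12" + "3"). A pandas sketch with illustrative values:

```python
import pandas as pd

f1 = pd.Series([1, 2, 12])
f2 = pd.Series([2, 1, 3])

# Wrong: integer addition merges distinct pairs, e.g. (1, 2) and (2, 1)
bad = f1 + f2

# Correct: concatenate as strings with a separator so (1, 23) != (12, 3)
interaction = f1.astype(str) + "_" + f2.astype(str)
print(interaction.tolist())  # ['1_2', '2_1', '12_3']
```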
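On question 6: the t-SNE algorithm itself works for any target dimensionality. In scikit-learn the default Barnes-Hut approximation only supports `n_components < 4`, so a 20-dimensional projection needs `method='exact'`. A small sketch (toy data, small perplexity to keep it fast):

```python
import numpy as np
from sklearn.manifold import TSNE

X = np.random.rand(50, 30)

# Barnes-Hut (the default) is limited to fewer than 4 components;
# the exact method handles a 20-dimensional embedding
proj = TSNE(n_components=20, method='exact', perplexity=10,
            random_state=0).fit_transform(X)
print(proj.shape)  # (50, 20)
```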