What are feature vectors?

Instruction: Define feature vectors and explain their importance in machine learning models.

Context: This question tests the candidate's understanding of the role of feature vectors in machine learning algorithms.

Example Answer

The way I'd explain it in an interview is this: a feature vector is the numerical representation of a single example that a model actually consumes. It takes the raw input, whether text, image signals, tabular values, or engineered attributes, and expresses it as an ordered set of numeric features in a consistent format.
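To make the definition concrete, here is a minimal sketch of encoding one raw record as a feature vector. The field names and encoding choices (category codes, boolean as 0/1) are illustrative assumptions, not from the original text.

```python
def to_feature_vector(record):
    """Encode a raw record as a fixed-order list of numbers.

    The fields and codes below are hypothetical examples.
    """
    color_codes = {"red": 0.0, "green": 1.0, "blue": 2.0}
    return [
        float(record["height_cm"]),           # numeric feature used as-is
        float(record["weight_kg"]),
        color_codes[record["color"]],         # categorical feature mapped to a code
        1.0 if record["is_active"] else 0.0,  # boolean feature as 0/1
    ]

vec = to_feature_vector(
    {"height_cm": 180, "weight_kg": 75, "color": "blue", "is_active": True}
)
print(vec)  # [180.0, 75.0, 2.0, 1.0]
```

The key property is the fixed order: every example, at training and at inference, yields a vector with the same features in the same positions.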

The reason feature vectors matter is that the model only learns from what is encoded there. If important information is missing, badly transformed, or misaligned, the model cannot recover it on its own. So when I think about a feature vector, I think less about the shape of the array and more about whether it captures the information needed for the task clearly, consistently, and at inference time as well as at training time.
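The train-versus-inference consistency point can be sketched with a tiny bag-of-words encoder: the vocabulary is fixed at training time and reused unchanged at inference, so unseen words are dropped rather than silently changing the vector layout. This toy class is an illustrative assumption, not a specific library API.

```python
class BagOfWords:
    """Toy bag-of-words encoder: fit a vocabulary once, reuse it everywhere."""

    def fit(self, texts):
        # Vocabulary (and thus vector layout) is fixed here, at training time.
        vocab = sorted({w for t in texts for w in t.split()})
        self.index = {w: i for i, w in enumerate(vocab)}
        return self

    def transform(self, text):
        vec = [0] * len(self.index)
        for w in text.split():
            if w in self.index:  # words unseen at training are dropped
                vec[self.index[w]] += 1
        return vec

bow = BagOfWords().fit(["cat sat", "dog sat"])   # vocab: cat, dog, sat
print(bow.transform("cat sat"))           # [1, 0, 1]
print(bow.transform("cat sat purring"))   # [1, 0, 1]; "purring" was never seen
```

Because the same fitted `index` is applied at inference, the model always receives vectors whose positions mean the same thing they meant during training.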

Common Poor Answer

A weak answer says feature vectors are arrays of numbers, without explaining that they are the model's actual representation of the example and therefore shape what the model can learn.

Related Questions