
TensorFlow / Keras — 15 Interview Questions

Eager execution, tf.Tensor, Keras layers, compile/fit/evaluate, tf.data, callbacks, and deployment basics.


1. TensorFlow 2 default execution mode. [Easy]
Answer: Eager by default—ops run immediately like NumPy; @tf.function traces graphs for speed and export.
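A minimal sketch of the two modes (values are illustrative):

```python
import tensorflow as tf

# Eager (the TF2 default): ops run immediately, like NumPy.
x = tf.constant([1.0, 2.0])
y = x * 2.0                    # concrete result, no session needed
print(y.numpy())               # [2. 4.]
print(tf.executing_eagerly())  # True

# @tf.function traces the Python function into a graph on first call,
# then reuses the compiled graph on later calls.
@tf.function
def double(t):
    return t * 2.0

print(double(x).numpy())       # [2. 4.]
```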
2. tf.Tensor vs tf.Variable. [Easy]
Answer: Tensors are immutable values in the graph; tf.Variable holds trainable state updated by optimizers.
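The distinction in two lines (toy values):

```python
import tensorflow as tf

t = tf.constant([1.0, 2.0])    # immutable value
v = tf.Variable([1.0, 2.0])    # mutable, trainable state

v.assign_add([0.5, 0.5])       # in-place update, as an optimizer step would do
print(v.numpy())               # [1.5 2.5]
# t has no assign/assign_add: tensors cannot be mutated
```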
3. Sequential vs Functional API. [Easy]
Answer: Sequential—single input/output stack. Functional—arbitrary DAG: shared layers, skips, multi-input/output.
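A short contrast, sketched with tiny layer sizes (all shapes illustrative):

```python
import tensorflow as tf
from tensorflow import keras

# Sequential: a single-input, single-output stack.
seq = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])

# Functional: an arbitrary DAG, e.g. a skip connection.
inp = keras.Input(shape=(4,))
h = keras.layers.Dense(4, activation="relu")(inp)
out = keras.layers.Add()([inp, h])       # skip: raw input added back in
model = keras.Model(inputs=inp, outputs=out)
```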
4. What does compile() configure? [Easy]
Answer: Optimizer, loss (string or callable), metrics—prepares training; no weights updated until fit.
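For example (optimizer and loss choices here are arbitrary):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])

# Records the training configuration only; no weights change here.
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),
    loss="mse",
    metrics=["mae"],
)
```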
5. fit() vs custom training loop. [Medium]
Answer: fit handles epoch/batch loop, metrics, callbacks—fast to use. Custom loop with GradientTape when you need fine control (GANs, research).
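A minimal custom step with GradientTape, fitting toy data for y = 2x + 1 (sizes and learning rate are illustrative):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.Input(shape=(1,)), keras.layers.Dense(1)])
opt = keras.optimizers.SGD(learning_rate=0.1)
loss_fn = keras.losses.MeanSquaredError()

x = tf.constant([[0.0], [1.0], [2.0], [3.0]])
y = 2.0 * x + 1.0

for _ in range(200):                     # the loop fit() would run for us
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))

print(float(loss))                       # small: the line has been fit
```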
6. tf.GradientTape. [Medium]
Answer: Records forward ops on watched variables; tape.gradient(loss, vars) for backward—use persistent=True if multiple gradients from same tape.
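The persistent case in miniature:

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape(persistent=True) as tape:
    y = x * x      # y = x^2
    z = y * y      # z = x^4

# persistent=True allows more than one gradient() call on the same tape.
dy_dx = tape.gradient(y, x)    # 2x   = 6.0
dz_dx = tape.gradient(z, x)    # 4x^3 = 108.0
del tape                       # free the tape's resources when done
```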
7. tf.data.Dataset benefits. [Medium]
Answer: Pipelines for prefetch, map, shuffle, batch—overlaps CPU prep with GPU; scales to large data.
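A typical pipeline shape (the transforms are illustrative):

```python
import tensorflow as tf

# A lazy pipeline: nothing runs until the dataset is iterated.
ds = (
    tf.data.Dataset.range(10)
    .map(lambda i: i * 2)            # CPU-side preprocessing
    .shuffle(buffer_size=10, seed=0)
    .batch(4)
    .prefetch(tf.data.AUTOTUNE)      # overlap prep with accelerator compute
)

for batch in ds:
    print(batch.numpy())             # batches of up to 4 shuffled even numbers
```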
8. Common Keras callbacks. [Easy]
Answer: EarlyStopping, ModelCheckpoint, TensorBoard, ReduceLROnPlateau—hooks into training loop without rewriting fit.
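Typical setup (the monitor names assume a validation split is provided to fit):

```python
from tensorflow import keras

callbacks = [
    # Stop when val_loss hasn't improved for 3 epochs; restore best weights.
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=3,
                                  restore_best_weights=True),
    # Halve the learning rate when val_loss plateaus for 2 epochs.
    keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5,
                                      patience=2),
]
# Then: model.fit(x, y, validation_split=0.2, callbacks=callbacks)
```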
9. Training vs inference in Keras. [Medium]
Answer: In fit, Keras sets training mode for BN/Dropout; in custom calls use training=True/False. evaluate/predict use inference—mismatch causes misleading metrics.
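Dropout makes the flag visible (rate and sizes are arbitrary):

```python
import tensorflow as tf
from tensorflow import keras

layer = keras.layers.Dropout(0.5)
x = tf.ones((1, 10))

train_out = layer(x, training=True)    # units dropped, survivors scaled by 2
infer_out = layer(x, training=False)   # identity: dropout disabled
print(infer_out.numpy())               # all ones
```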
10. sparse_categorical_crossentropy vs categorical_crossentropy. [Medium]
Answer: Sparse uses integer class indices; categorical expects one-hot vectors. Both accept raw logits when constructed with from_logits=True; otherwise they expect probabilities.
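The two losses agree once the label encoding matches (toy logits):

```python
import tensorflow as tf
from tensorflow import keras

logits = tf.constant([[2.0, 1.0, 0.1]])
int_label = tf.constant([0])               # class index
one_hot = tf.constant([[1.0, 0.0, 0.0]])   # same label, one-hot encoded

sparse = keras.losses.SparseCategoricalCrossentropy(from_logits=True)
dense = keras.losses.CategoricalCrossentropy(from_logits=True)

# Same underlying loss, different label encodings:
print(float(sparse(int_label, logits)))    # ~0.417
print(float(dense(one_hot, logits)))       # same value
```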
11. H5 / .keras file vs SavedModel. [Medium]
Answer: SavedModel is TF’s preferred format for serving (TF Serving), TF Lite conversion, and portability; H5/.keras files store architecture + weights and live mainly within the Python ecosystem.
12. MirroredStrategy (one line). [Hard]
Answer: Data-parallel multi-GPU on one machine—replicas sync gradients; wrap model creation and compile inside strategy scope.
13. TensorFlow Lite: when? [Medium]
Answer: On-device / edge inference—smaller binary, quantization; convert from SavedModel with TFLite converter.
14. @tf.function retracing. [Hard]
Answer: New concrete input shapes or Python side effects can trigger retrace—costly; use consistent tensor shapes or input_signature where possible.
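Retracing can be observed with a Python side effect, which only runs while tracing (a small sketch; counts assume the default retracing behavior):

```python
import tensorflow as tf

trace_count = 0

@tf.function
def add_one(x):
    global trace_count
    trace_count += 1                 # Python code: runs only during tracing
    return x + 1

add_one(tf.constant([1.0]))          # new shape/dtype -> trace #1
add_one(tf.constant([2.0]))          # same signature  -> graph reused
add_one(tf.constant([[1.0, 2.0]]))   # new shape       -> trace #2
add_one(1)                           # Python scalars retrace per value: #3
add_one(2)                           #                                   #4
print(trace_count)                   # 4
```

Passing tensors with a fixed input_signature (e.g. a tf.TensorSpec with a flexible batch dimension) is the usual way to cap retraces.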
15. Keras 3 / multi-backend (high level). [Easy]
Answer: Keras 3 can target TF, JAX, or PyTorch as backends—same API idea; interview: know TF+Keras is still dominant in many production shops.
Mention GradientTape + custom step if they ask “beyond fit.”

Quick review checklist

  • Eager vs tf.function; Variable; Sequential vs Functional.
  • compile/fit; GradientTape; tf.data; callbacks.
  • Sparse vs categorical loss; SavedModel; TFLite / MirroredStrategy sketch.