The empirical literature on nominal exchange rates shows that the current exchange rate is often a better predictor of future exchange rates than a linear combination of macroeconomic fundamentals. This result underlies the famous Meese-Rogoff puzzle. In this paper we evaluate whether parameter instability can account for this puzzle. We consider a theoretical reduced-form relationship between the exchange rate and fundamentals in which parameters are either constant or time varying. We calibrate the model to data on exchange rates and fundamentals and conduct the same Meese-Rogoff exercise on data generated by the model. Our main finding is that the impact of time-varying parameters on prediction performance is either very small or goes in the wrong direction. To help interpret these findings, we derive theoretical results on the impact of time-varying parameters on the out-of-sample forecasting performance of the model. We conclude that it is small-sample estimation bias, rather than time-varying parameters, that explains the Meese-Rogoff puzzle.
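The Meese-Rogoff exercise referred to above can be sketched as follows: estimate the exchange-rate/fundamentals relationship on a rolling window, form one-step-ahead forecasts, and compare their root mean squared error with that of a no-change (random walk) forecast. The data-generating process, window length, and parameter values below are illustrative assumptions, not the paper's calibration, and which forecast wins depends on those choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative DGP (assumed, not the paper's): s_t = beta * f_t + u_t,
# with a random-walk fundamental f_t and constant parameter beta.
T, window, beta = 400, 100, 1.0
f = np.cumsum(rng.normal(size=T))              # fundamental: random walk
s = beta * f + rng.normal(scale=0.5, size=T)   # exchange rate

err_rw, err_model = [], []
for t in range(window, T - 1):
    # Rolling OLS of s on f over the estimation window.
    X = np.column_stack([np.ones(window), f[t - window:t]])
    b = np.linalg.lstsq(X, s[t - window:t], rcond=None)[0]
    # One-step-ahead forecast using the realized future fundamental,
    # as in the original Meese-Rogoff design.
    err_model.append(s[t + 1] - (b[0] + b[1] * f[t + 1]))
    # Random walk benchmark: no-change forecast.
    err_rw.append(s[t + 1] - s[t])

def rmse(e):
    return np.sqrt(np.mean(np.square(e)))

print(f"RMSE random walk:   {rmse(err_rw):.3f}")
print(f"RMSE fundamentals:  {rmse(err_model):.3f}")
```

The comparison of the two RMSE values is the heart of the exercise; the paper's point is that rerunning it on model-generated data, with parameters either constant or time varying, isolates what drives the random walk's empirical success.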