In many predictive modeling scenarios, the production set inputs, i.e., the instances for which predictions will later be made, are available and could be utilized in the modeling process. In fact, many predictive models are generated with an existing production set in mind. Despite this, few approaches utilize this information to produce models optimized for the production set at hand. If these models need to be comprehensible, the oracle coaching framework can be applied, often resulting in interpretable models, e.g., decision trees and rule sets, with accuracies on par with opaque models like neural networks and ensembles on the specific production set. In oracle coaching, a strong but opaque predictive model is used to label instances, including the production set, which are then learned by a weaker but interpretable model. In this paper, oracle coaching is, for the first time, used to improve the calibration of probabilistic predictors. More specifically, setups where oracle coaching is combined with Platt scaling, isotonic regression and Venn-Abers are suggested and evaluated for calibrating probability estimation trees (PETs). A key contribution is the design of setups ensuring that the oracle-coached PETs, which by definition utilize knowledge about production data, remain well-calibrated. In the experiments, using 23 publicly available data sets, it is shown that oracle-coached models are not only more accurate but also significantly better calibrated than models from standard induction. Interestingly, this holds both for the uncalibrated PETs and for all calibration techniques evaluated, i.e., Platt scaling, isotonic regression and Venn-Abers. As expected, all three external techniques significantly improved the calibration of the original PETs. Finally, a direct comparison between the three external calibration techniques showed that Venn-Abers significantly outperformed the alternatives in most setups.
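To make the calibration step concrete, the following is a minimal sketch of one of the external techniques mentioned, isotonic regression, implemented with the pool-adjacent-violators algorithm (PAV). It maps raw PET scores to calibrated probabilities by fitting a non-decreasing step function to the binary labels. This is an illustrative sketch, not the paper's implementation; all identifiers are made up for this example.

```python
def isotonic_fit(scores, labels):
    """Calibrate binary scores with isotonic regression via PAV.

    scores: raw probability estimates (e.g., from a PET) for the positive class
    labels: 0/1 outcomes for the same instances
    Returns calibrated probabilities, one per instance, in the input order.
    """
    # Sort instances by score; isotonic regression fits a monotone function
    # of the score, so we work in score order.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    # Each block holds [sum_of_labels, count]; its mean is the fitted value.
    blocks = []
    for i in order:
        blocks.append([float(labels[i]), 1])
        # Pool adjacent blocks while the monotonicity constraint is violated,
        # i.e., while an earlier block's mean exceeds a later block's mean.
        while (len(blocks) > 1 and
               blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand block means back to one fitted value per instance (score order).
    fitted = []
    for s, c in blocks:
        fitted.extend([s / c] * c)
    # Undo the sort so the output aligns with the original instance order.
    result = [0.0] * len(scores)
    for pos, i in enumerate(order):
        result[i] = fitted[pos]
    return result
```

For example, `isotonic_fit([0.1, 0.3, 0.6, 0.9], [0, 1, 0, 1])` pools the conflicting middle pair into a single block, yielding `[0.0, 0.5, 0.5, 1.0]`. Platt scaling differs in that it fits a parametric sigmoid to the scores instead of a step function, which typically needs less calibration data but is less flexible.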