So I think I have worked out a way to solve the imputation problem. I have written an iterative function that searches for blocks of missing data and passes the complete data that comes before each block to the TIM API to predict what the missing values should be. It isn't ideal, but it is the best solution I could find for intelligently using the TIM API to fill in the blanks.
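To make that concrete, here is a rough sketch of the idea in Python. The names (`impute_with_tim`, `call_tim_forecast`, `min_history`) are mine, not part of any TIM client library, and `call_tim_forecast` is just a stand-in for however you actually call the TIM API in your setup; the gap-finding loop is the part I am describing above.

```python
import pandas as pd


def call_tim_forecast(history: pd.Series, horizon: int) -> pd.Series:
    # Placeholder: in the real workflow this would send `history` to the TIM
    # API, let it build its model, and return `horizon` forecast values.
    # Here it just repeats the last known value so the sketch runs standalone.
    return pd.Series([history.iloc[-1]] * horizon)


def impute_with_tim(series: pd.Series, min_history: int = 48) -> pd.Series:
    """Scan the series for blocks of missing values and fill each block with
    a forecast built from all of the (now more complete) data before it."""
    filled = series.copy()
    i = 0
    while i < len(filled):
        if pd.isna(filled.iloc[i]):
            start = i
            # Find where this block of missing values ends.
            while i < len(filled) and pd.isna(filled.iloc[i]):
                i += 1
            gap_len = i - start
            history = filled.iloc[:start].dropna()
            if len(history) >= min_history:
                predictions = call_tim_forecast(history, gap_len)
                filled.iloc[start:start + gap_len] = predictions.values
            # Not enough history yet? Leave the gap and move on.
        else:
            i += 1
    return filled
```

Because earlier gaps get filled before later ones are reached, each successive call to the API sees a more complete dataset, which is exactly the iterative behaviour described below.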

This is possible because of one of the unique features of TIM: it uses a branch of mathematics called Information Geometry to build the best model it can, taking care of the feature engineering and other automation settings, and creates what it calls a Real-Time Instant ML model, which it discards as soon as it is used. That means that instead of having to manually rebuild the model for each iteration of missing data, each call to the API feeds in a progressively more complete dataset and TIM returns predictions for it. Of course, instant doesn't really mean instant, but an iteration can take as little as 30 seconds to complete, as opposed to rebuilding a model manually each time.

The downside to this approach is that inaccuracies in values imputed early on become part of the data fed to later models, so errors can compound as the iterations progress.
