Conversation
Pull request overview
Adds optional TimeGPT fine-tuning support by introducing a configuration object that, when provided, forwards fine-tuning parameters to the underlying NixtlaClient.forecast() call.
Changes:
- Added `TimeGPTFinetuningConfig` dataclass to encapsulate fine-tuning parameters.
- Extended `TimeGPT` to accept `finetuning_config` and forward fine-tune kwargs during `forecast()`.
- Updated API docs to include `TimeGPTFinetuningConfig` in generated documentation.
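A minimal sketch of what the new dataclass could look like, based only on the parameters forwarded in this PR (the defaults shown here are illustrative assumptions, not the actual ones):

```python
from dataclasses import dataclass

@dataclass
class TimeGPTFinetuningConfig:
    # Field names come from the kwargs forwarded in the diff below;
    # the default values are assumptions for illustration.
    finetune_steps: int = 10
    finetune_loss: str = "default"
    finetune_depth: int = 1
```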
Reviewed changes
Copilot reviewed 2 out of 3 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| `timecopilot/models/foundation/timegpt.py` | Adds a finetuning config object and forwards fine-tuning parameters into the TimeGPT forecast call path. |
| `docs/api/models/foundation/models.md` | Exposes the new config type in the foundation models API docs. |
```python
    """
    freq = self._maybe_infer_freq(df, freq)
    client = self._get_client()
    finetune_kwargs: dict = {}
```
`finetune_kwargs` is typed as a bare `dict`, which loses key/value typing and is inconsistent with other fine-tuning wrappers (e.g., Chronos uses `dict[str, Any]` for kwargs). Consider typing it as `dict[str, Any]` (and importing `Any`) so mypy/IDE tooling can catch mistakes in forwarded parameters.
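A sketch of the suggested typing, pulled into a standalone helper for illustration (the helper name is hypothetical; the three keys come from the diff, and `config` is any object exposing those attributes, or `None`):

```python
from typing import Any

def build_finetune_kwargs(config) -> dict[str, Any]:
    # dict[str, Any] keeps key/value typing visible to mypy and IDEs,
    # so a misspelled forwarded parameter is easier to catch.
    finetune_kwargs: dict[str, Any] = {}
    if config is not None:
        finetune_kwargs["finetune_steps"] = config.finetune_steps
        finetune_kwargs["finetune_loss"] = config.finetune_loss
        finetune_kwargs["finetune_depth"] = config.finetune_depth
    return finetune_kwargs
```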
```python
    if self.finetuning_config is not None:
        finetune_kwargs["finetune_steps"] = self.finetuning_config.finetune_steps
        finetune_kwargs["finetune_loss"] = self.finetuning_config.finetune_loss
        finetune_kwargs["finetune_depth"] = self.finetuning_config.finetune_depth
```
The new finetuning forwarding logic isn't covered by tests. Since `TimeGPT` can't be exercised in CI without live API calls, consider adding a unit test that mocks `NixtlaClient.forecast` to assert: (1) `finetune_steps`/`finetune_loss`/`finetune_depth` are passed when `finetuning_config` is set, and (2) no finetune kwargs are passed when it is `None`. This will catch parameter-name regressions without making network requests.
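The suggested test could be sketched roughly like this. `forecast_with` is a hypothetical stand-in for the wrapper's call path so the forwarding can be asserted without network access; a real test would instead patch `NixtlaClient.forecast` on the actual `TimeGPT` instance:

```python
from unittest.mock import MagicMock

def forecast_with(finetuning_config):
    # Stand-in for TimeGPT.forecast(): builds kwargs the way the diff does
    # and forwards them to a mocked NixtlaClient.forecast.
    client = MagicMock()
    finetune_kwargs = {}
    if finetuning_config is not None:
        finetune_kwargs["finetune_steps"] = finetuning_config["finetune_steps"]
        finetune_kwargs["finetune_loss"] = finetuning_config["finetune_loss"]
        finetune_kwargs["finetune_depth"] = finetuning_config["finetune_depth"]
    client.forecast(h=12, **finetune_kwargs)
    return client

# (1) fine-tune kwargs are forwarded when a config is set
client = forecast_with(
    {"finetune_steps": 5, "finetune_loss": "mae", "finetune_depth": 2}
)
assert client.forecast.call_args.kwargs["finetune_steps"] == 5

# (2) no fine-tune kwargs when the config is None
client = forecast_with(None)
assert "finetune_steps" not in client.forecast.call_args.kwargs
```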
This PR adds TimeGPT fine-tuning.