Conversation

@SamuelMarks
Collaborator

Description

At 1632 lines, src/MaxText/utils/ckpt_conversion/utils/param_mapping.py may just be too big a file!

There are two obvious ways of doing this refactor:

  1. Group the Qwen, Gemma, Llama, etc. files under big nouns like "checkpoint_conversion" or "multimodal_utils"
  2. Give Qwen, Gemma, Llama, etc. the same folder/module layout each, so Qwen3/checkpoint_conversion.py,
    Qwen3/get_image_offsets.py, Qwen3/apply_embedding.py, etc. (sketched below)
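
To make the second layout concrete, a hypothetical directory tree (module names taken from the list above; the exact root path is illustrative) could look like:

```
src/MaxText/models/
├── Qwen3/
│   ├── checkpoint_conversion.py
│   ├── get_image_offsets.py
│   └── apply_embedding.py
└── Gemma/
    ├── checkpoint_conversion.py
    ├── get_image_offsets.py
    └── apply_embedding.py
```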

If you take the latter approach to its extreme, the models themselves can be removed from this repository. New models can be added privately (e.g., on NVIDIA's or AMD's fork), and the loose coupling enables them to take full advantage of new/improved functionality throughout MaxText.
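
Taken that far, model discovery could be plugin-based. A minimal sketch, assuming a hypothetical `maxtext.models` entry-point group (not a current MaxText mechanism), so forks register models from their own packages:

```python
# Sketch only: the "maxtext.models" entry-point group is hypothetical.
# A fork would declare its model package in its own pyproject.toml:
#
#   [project.entry-points."maxtext.models"]
#   qwen3 = "my_fork.models.qwen3"
#
# Requires Python 3.10+ for the group= keyword.
from importlib.metadata import entry_points


def discover_models() -> dict:
  """Return {model_name: module} for every registered model package."""
  return {ep.name: ep.load() for ep in entry_points(group="maxtext.models")}
```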

Various mechanisms can be used to make the second approach type safe, for example a shared interface that every per-model package implements (the strategy pattern this PR follows):
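
A minimal sketch, assuming a typing.Protocol as that shared interface; class and method names below are illustrative, not the PR's actual API:

```python
# Sketch only: names are illustrative. A static checker (mypy/pyright)
# rejects any per-model strategy that does not satisfy the Protocol.
from typing import Any, Dict, Protocol, runtime_checkable


@runtime_checkable
class ParamMappingStrategy(Protocol):
  def to_maxtext(self, params: Dict[str, Any]) -> Dict[str, Any]:
    """Map a source checkpoint's param tree to MaxText's layout."""
    ...

  def from_maxtext(self, params: Dict[str, Any]) -> Dict[str, Any]:
    """Map MaxText params back to the source layout."""
    ...


def convert(strategy: ParamMappingStrategy, params: Dict[str, Any]) -> Dict[str, Any]:
  # Works for any model package, in-tree or out-of-tree, that ships a
  # class satisfying the Protocol.
  return strategy.to_maxtext(params)
```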

And the two approaches above can be done in phases: with approach 1 done first, the final refactor to approach 2 can then be performed trivially in any IDE or language server.

Even keeping all the models in this repository has advantages, particularly when refactoring, adding new features, or upgrading dependencies and ensuring consistency throughout (for quality and benchmarking purposes).

Tests

CI

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  - [ ] I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  - [ ] I have added necessary comments to my code, particularly in hard-to-understand areas.
  - [ ] I have run end-to-end tests and provided workload links above if applicable.
  - [ ] I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

…ersion following strategy pattern ; [tests/ckpt_conversion_param_mapping_strategy_test.py] Add test