
Conversation

@vkuzo (Contributor) commented Nov 5, 2025

Summary:

Adds warnings that the following configs will be moved to prototype in a
future release:

* `Int8DynamicActivationInt4WeightConfig`
* `Int4DynamicActivationInt4WeightConfig`
* `GemliteUIntXWeightOnlyConfig`
* `Float8StaticActivationFloat8WeightConfig`
* `UIntXWeightOnlyConfig`
* `FPXWeightOnlyConfig`

See #2752 for more context

Test Plan:

Reviewers:

Subscribers:

Tasks:

Tags:

[ghstack-poisoned]
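A minimal sketch of the warning pattern this PR adds, assuming a hypothetical `_warn_moving_to_prototype` helper and a simplified config class (the real torchao configs differ; names here are illustrative only):

```python
import warnings

def _warn_moving_to_prototype(config_name: str) -> None:
    # Hypothetical helper; the message mirrors the one added in this PR.
    warnings.warn(
        f"`{config_name}` will be moving to prototype in a future release "
        "of torchao. Please see https://github.com/pytorch/ao/issues/2752 "
        "for more details."
    )

class UIntXWeightOnlyConfig:
    # Simplified stand-in for the real config class.
    def __init__(self):
        # Warn on each instantiation that this config is moving.
        _warn_moving_to_prototype("UIntXWeightOnlyConfig")
```

Constructing the config then emits the move-to-prototype warning, which callers can observe with `warnings.catch_warnings(record=True)`.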

vkuzo commented Nov 5, 2025

Stack from ghstack (oldest at bottom):

vkuzo added a commit that referenced this pull request Nov 5, 2025
ghstack-source-id: 6eec9d6
ghstack-comment-id: 3490630687
Pull-Request: #3294

pytorch-bot bot commented Nov 5, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3294

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit e83eb83 with merge base 9266734:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 5, 2025
@vkuzo vkuzo added the topic: deprecation Use this tag if this PR deprecates a feature label Nov 5, 2025
[ghstack-poisoned]
vkuzo added a commit that referenced this pull request Nov 6, 2025
ghstack-source-id: 548fca9
ghstack-comment-id: 3490630687
Pull-Request: #3294
    "torchao.quantization.Int8DynamicActivationInt4WeightConfig"
)
warnings.warn(
    "`Int8DynamicActivationInt4WeightConfig` will be moving to prototype in a future release of torchao. Please see https://github.com/pytorch/ao/issues/2752 for more details."
)
Contributor
I feel this config can just be removed in the future in favor of `Int8DynamicActivationIntXWeightConfig`. Also, maybe add the replacement in the warning?

Contributor Author
I agree long term. I want to prioritize moving these out of the main folder to clean it up; deleting them outright is less urgent. If someone wants to own deleting them outright on a tight timeline, sounds good to me!

# Each call should have at least one warning.
# Some of them can have two warnings - one for deprecation,
# one for moving to prototype
self.assertTrue(len(_warnings) > 0)
Contributor
I think we want to assert `len(_warnings) == 1` here, since we moved the warnings context manager outside the loop? E.g. if it's 2, that means we've logged a warning each time we call the API, which could be very noisy.

Contributor Author

I can fix this; we can assert the length is 1 to 2. The 2nd warning is the one being added in this PR.

    "torchao.quantization.Int4DynamicActivationInt4WeightConfig"
)
warnings.warn(
    "`Int4DynamicActivationInt4WeightConfig` will be moving to prototype in a future release of torchao. Please see https://github.com/pytorch/ao/issues/2752 for more details."
)
Contributor
Do we know if anyone's actually using these configs? Is it moving to prototype just in case someone is still using them?

Contributor Author

Moving to prototype lets us clean up the main folder faster, and we can delete from prototype at a later time. Note that some internal use cases are still using some of these configs.

[ghstack-poisoned]
vkuzo added a commit that referenced this pull request Nov 6, 2025
ghstack-source-id: 2568e85
ghstack-comment-id: 3490630687
Pull-Request: #3294
@vkuzo vkuzo merged commit 1fbc364 into main Nov 7, 2025
50 checks passed
namgyu-youn pushed a commit to namgyu-youn/ao that referenced this pull request Nov 21, 2025
* Update

[ghstack-poisoned]
