Description from extension meta
Optimizes ChatGPT performance on long conversations via DOM windowing and lazy loading.
Description from store
Make long ChatGPT threads fast and predictable.
ChatGPT Optimizer keeps the interface snappy by reducing live DOM size and loading older messages in fixed batches. The result: smoother scrolling, fewer layout spikes, and lower memory use on big conversations.
Why it’s faster
DOM windowing: keep ~50 recent messages in the DOM; older nodes are hidden to reduce style and layout work.
Batch lazy-load: reveal older messages in fixed steps (e.g., 25 at a time) for bounded, predictable work.
Observer orchestration: a subtree MutationObserver keeps counts accurate; an optional IntersectionObserver loads one batch per distinct scroll, with no cascading reveals (sketched below).
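A minimal sketch of how such a window and batched reveal could be wired up in a content script. The selector, class name, sentinel handling, and helper names below are illustrative assumptions, not the extension's actual code:

```typescript
// Minimal sketch of the windowing idea; selectors, class names, and helpers are assumptions.
const WINDOW = 50;  // most recent messages kept fully live
const BATCH = 25;   // older messages revealed per step
const TURN_SELECTOR = '[data-testid^="conversation-turn"]'; // assumed message selector
const HIDDEN_CLASS = 'cgpt-opt-hidden'; // assumed class, e.g. { content-visibility: hidden; }

let revealed = 0; // older messages the user has explicitly loaded back in

function turns(): HTMLElement[] {
  return Array.from(document.querySelectorAll<HTMLElement>(TURN_SELECTOR));
}

// Hide everything older than the live window plus whatever has already been revealed.
function applyWindow(): void {
  const all = turns();
  const cutoff = Math.max(0, all.length - WINDOW - revealed);
  all.forEach((el, i) => el.classList.toggle(HIDDEN_CLASS, i < cutoff));
  placeSentinel();
}

// One fixed, predictable step: reveal BATCH more older messages.
function revealBatch(): void {
  if (!turns().some(el => el.classList.contains(HIDDEN_CLASS))) return; // nothing hidden
  revealed += BATCH;
  applyWindow();
}

// Optional one-batch-per-scroll: a sentinel at the hidden/visible boundary triggers
// exactly one batch each time it scrolls into view, so reveals never cascade.
const sentinel = document.createElement('div');
new IntersectionObserver(entries => {
  if (entries.some(e => e.isIntersecting)) revealBatch();
}).observe(sentinel);

function placeSentinel(): void {
  const firstVisible = turns().find(el => !el.classList.contains(HIDDEN_CLASS));
  firstVisible?.before(sentinel);
}

// Subtree MutationObserver keeps the window and stats accurate as messages stream in.
new MutationObserver(() => applyWindow())
  .observe(document.body, { childList: true, subtree: true });
applyWindow();
```

Hiding via a class that applies content-visibility: hidden (one possible choice) keeps the nodes in place while skipping their style, layout, and paint work.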
Typical results (internal tests, 300–600 messages)
5–15× fewer live DOM nodes
20–45% less layout/recalc time
25–50% lower JS heap memory
(Actual results vary by device, theme, and content.)
Features
Window and batch size controls (e.g., Window 50, Batch 25)
Clear “Load 25 more” prompt at the hidden/visible boundary
Optional one-batch-per-scroll (off by default)
Stats panel: total / visible / hidden / initialized (see the sketch after this list)
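One way the controls and stats panel could be backed is sketched below, assuming the Chrome extension storage API and the same assumed selector/class as above; the setting keys, the Stats shape, and the dataset marker behind "initialized" are guesses for illustration:

```typescript
// Sketch only: key names and shapes are assumptions, not the extension's actual code.
interface Settings { window: number; batch: number; perScroll: boolean; }
const DEFAULTS: Settings = { window: 50, batch: 25, perScroll: false };

// Preferences persist via chrome.storage (covered by the "storage" permission).
function loadSettings(apply: (s: Settings) => void): void {
  chrome.storage.sync.get(DEFAULTS, items => apply(items as Settings));
}

// Counts shown in the stats panel: total / visible / hidden / initialized.
interface Stats { total: number; visible: number; hidden: number; initialized: number; }

function collectStats(): Stats {
  const all = Array.from(
    document.querySelectorAll<HTMLElement>('[data-testid^="conversation-turn"]')
  );
  const hidden = all.filter(el => el.classList.contains('cgpt-opt-hidden')).length;
  // "initialized" is assumed here to mean nodes the extension has already processed.
  const initialized = all.filter(el => el.dataset.cgptOptInit === '1').length;
  return { total: all.length, visible: all.length - hidden, hidden, initialized };
}
```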
Privacy & permissions
No tracking. No external servers.
Works only on chat.openai.com and chatgpt.com.
Permissions: storage (save preferences), tabs (detect the ChatGPT tab; see the sketch below), and optional declarativeNetRequest rules.
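For illustration, one way the tabs permission could be used to detect a ChatGPT tab; the hostnames come from this listing, everything else is an assumption:

```typescript
// Sketch only: how the "tabs" permission could drive a ChatGPT-tab check.
const CHATGPT_HOSTS = ['chat.openai.com', 'chatgpt.com'];

function isChatGptUrl(url: string | undefined): boolean {
  if (!url) return false;
  try {
    return CHATGPT_HOSTS.includes(new URL(url).hostname);
  } catch {
    return false;
  }
}

// e.g. in a popup: only enable controls when the active tab is a ChatGPT conversation.
chrome.tabs.query({ active: true, currentWindow: true }, tabs => {
  const onChatGpt = isChatGptUrl(tabs[0]?.url);
  console.log(onChatGpt ? 'ChatGPT tab detected' : 'Not a ChatGPT tab');
});
```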
Disclaimer
Not affiliated with OpenAI. “ChatGPT” is a trademark of its owner.
Statistics
Installs: 28
Category:
Rating: 5.0 (1 vote)
Last update / version: 2025-08-15 / 1.0.6
Listing languages: en