Plugin Author
胡洪刚
(@gwuluo)
Hello,
What error message is appearing? How can I assist you?
I’m using OpenRouter. After configuring it, the translation function works fine, but the AI responds slowly.
Hi,
I’ve noticed that even when the HHG for TranslatePress plugin is configured with a valid OpenAI API key, endpoint, and model (for example https://api.openai.com/v1/chat/completions with gpt-4o-mini), the plugin doesn’t actually send requests to OpenAI.
All translation requests, including the “Test API Credentials” action, are still sent to https://mtapi.translatepress.com/translations instead of the OpenAI endpoint.
Because of that, my OpenAI key is never used, and the translations come from the TranslatePress API instead.
Expected behavior:
When an OpenAI key and endpoint are set, the plugin should send translation requests directly to OpenAI, respecting the model and temperature parameters.
Observed behavior:
- The debug log always shows requests to mtapi.translatepress.com.
- The response returns a static example (about → acerca de), confirming it’s not coming from OpenAI.
Temporary workaround:
I edited the file /wp-content/plugins/hhg-for-translatepress/includes/api.php and replaced the endpoint with https://api.openai.com/v1/chat/completions, which fixed the issue and allowed the plugin to communicate with OpenAI directly.
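For anyone who wants to verify that their key and model work outside the plugin, here is a minimal sketch of what a direct request to the chat completions endpoint looks like. The URL, headers, and JSON shape follow OpenAI's public chat completions API; the helper name, prompt wording, and temperature default are illustrative choices, not part of the plugin.

```python
import json
import urllib.request

def build_translation_request(api_key, text, target_lang,
                              model="gpt-4o-mini", temperature=0.3):
    """Build (but do not send) a chat-completions request for a translation.

    The endpoint and payload fields follow OpenAI's documented chat
    completions API; the prompt text here is just an example.
    """
    payload = {
        "model": model,
        "temperature": temperature,
        "messages": [
            {"role": "user",
             "content": f"Translate the following text to {target_lang}: {text}"},
        ],
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Example: inspect the request without sending it (no real key needed).
req = build_translation_request("sk-example", "about", "Spanish")
print(req.full_url)  # → https://api.openai.com/v1/chat/completions
```

Sending the built request (with a real key) via `urllib.request.urlopen(req)` should return a normal chat completion; if the plugin were calling this endpoint, the same request shape is what you would expect to see in the debug log.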
Could you please confirm if this is a known issue or if there’s a new version planned that restores direct OpenAI integration?
Thanks a lot for your work on this plugin — I’d be happy to test any update or beta version.
Best,
David
Plugin Author
胡洪刚
(@gwuluo)
Yes, I noticed it. I have recently fixed this problem and optimized the speed, and I will release the fix after I finish testing.