In version 12, the CyberSEO Pro plugin introduces a major enhancement to AI integration – full support for custom AI endpoints and a greatly improved, unified engine format. These features are also available in the plugin’s lightweight counterpart, RSS Retriever, making both tools more powerful and flexible than ever before. While the engine parameter previously existed in the [gpt_article] shortcode with a limited set of fixed values, it’s now available throughout the entire plugin – including the UI – and accepts custom input in a free-form structure. This gives you unprecedented flexibility in selecting and connecting to models.
Custom AI Endpoints
On the “Accounts” page in the plugin settings, you’ll now find a new tab called Custom AI endpoints. It’s exactly what it sounds like – a place where you can register your own API endpoints for use with CyberSEO Pro.
Each endpoint requires:
- A unique ID (this becomes the provider part of your engine string)
- The endpoint URL (e.g. https://api.mistral.ai/v1/chat/completions)
- An API key (if required by the provider)
- An optional default model ID
Once you’ve added a custom endpoint and clicked Update settings, it becomes available throughout the plugin – in the rewriter, auto-comments, translation tools, shortcodes, and any field that uses an engine string.
Let’s walk through a real-world example. Suppose you want to try out the Mistral API – an OpenAI-compatible service. You take the endpoint URL, assign it the ID mistral, and if you want to use the mistral-small-latest model, your engine string becomes:
mistral-mistral-small-latest
You can now use this string in the plugin UI or in shortcodes like this:
[ai_generate engine="mistral-mistral-small-latest" prompt="Summarize this text: %post_content_notags%"]
The [custom_ai] shortcode
This shortcode is made specifically for custom endpoints. It works like the others, but lets you separate the endpoint ID and model ID explicitly:
[custom_ai id="mistral" model="mistral-small-latest" prompt="Identify yourself"]
If you’ve set a default model in the endpoint settings, the model parameter becomes optional.
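For example, assuming the mistral endpoint above has mistral-small-latest saved as its default model, the same call can presumably be shortened to:
[custom_ai id="mistral" prompt="Identify yourself"]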
More details on managing your custom endpoints are available here: Custom AI endpoints.
If you’re using this setup in other plugin features such as the rewriter, auto-comments, or translation, the corresponding engine value would be:
mistral-mistral-small-latest
The engine format
A key change in this release is how AI models are identified across the plugin. Instead of selecting from a predefined list, you now use a single engine parameter. This lets you specify both the provider (or custom endpoint ID) and the model in one string.
Examples:
- openai-gpt-4o – OpenAI
- meta-llama/llama-3.1-405b-instruct – OpenRouter
- mistral-mistral-small-latest – Your custom Mistral endpoint
The syntax is simple and unified. Full documentation is available here: AI engine.
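Any of these strings can go wherever an engine value is expected. For instance, if OpenRouter models are indeed referenced by their bare model ID as in the list above, a call might look like this:
[ai_generate engine="meta-llama/llama-3.1-405b-instruct" prompt="Summarize this text: %post_content_notags%"]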
Universal [ai_generate] shortcode
The [ai_generate] shortcode is now the universal way to call any model from any provider. It accepts engine as a required parameter and supports all common OpenAI-style parameters (temperature, max_tokens, top_p, etc.).
Example:
[ai_generate engine="openai-gpt-4o" prompt="Summarize this text: %post_content_notags%"]
If you’re experimenting with new providers or self-hosted models, this is the shortcode to use.
Plugin UI updates
Every setting in the plugin that used to ask for a model name now expects a full engine string. Don’t worry – the new Quick Select drop-down list lets you pick from common built-in models, and you can still enter your engine string manually using the correct format.
This update makes CyberSEO Pro the most flexible autoblogging plugin on the market. You’re no longer tied to a fixed list of AI providers: if an API follows the OpenAI standard, you’ll be able to use it.
Create content your way, with the models you choose.
Source: https://www.cyberseo.net/blog/connect-any-ai-api-with-custom-endpoints-in-cyberseo-pro/