A powerful AI assistant for Magento 2 that helps you interact with your store's data using natural language queries.
- Convert natural language questions into SQL queries
- Execute SELECT and DESCRIBE queries safely
- DDL and DML commands are not allowed by default, although this can be changed in configuration
- Open chat in new window
- Real-time token usage tracking
- Cost calculation based on model type
- Detailed statistics for:
- Current message tokens (prompt, completion, total)
- Session cumulative tokens
- Cost breakdown per request
- Total session cost
- Support for different OpenAI models with respective pricing:
- GPT-3.5 Turbo
- GPT-4
- GPT-4 Turbo
- GPT-4 32k
- Persistent conversation history
- Session-based context maintenance
- Automatic session cleanup
- Session ID tracking for debugging
- API key configuration
- Query validation
- Safe SQL execution
- Session-based authentication
- Clear error messages
- Automatic error fixing suggestions
- SQL error analysis
- Table structure inspection
- Decoupled OpenAI service for better maintainability
- Clean separation of concerns
- Extensible design
- Easy to customize and extend
- AI-powered customer-facing chatbot for your storefront
- Dual theme support (Hyva & Standard Magento)
- Response caching for common questions
- Customizable interface and suggested questions
- Mobile responsive design
- AI-powered product assistant in the admin panel
- Retrieval Augmented Generation (RAG) for accurate product information
- Maintains conversation history for context
- Displays formatted content with images and links
- Processes Markdown formatting for better readability
- Stop words removal for optimized token usage
- Token usage tracking and statistics
- Responsive chat interface with modern styling
- Install the module using Composer:
composer require genaker/magento-mcp-ai
- Enable the module:
bin/magento module:enable Genaker_MagentoMcpAi
- Run setup upgrade:
bin/magento setup:upgrade
- Clear cache:
bin/magento cache:clean
- Navigate to Stores > Configuration > Genaker > Magento MCP AI
- Enter your OpenAI API key
- Save configuration
- Configure custom rules for the AI assistant
- Define query generation behavior
- Set response formatting rules
- Default Model: Set the default AI model in Stores > Configuration > Genaker > Magento MCP AI > General Configuration > Default AI Model.
- Custom Model (Override): If you set a value in the Custom Model (Override) field, this model will be used for all AI requests, overriding the default model. Leave it blank to use the default model.
- Priority Order (see the sketch after this list):
  - If Custom Model (Override) is set, it is always used.
  - If not, the Default AI Model is used.
  - If both are empty, the fallback is `gpt-3.5-turbo`.
- Example:
  - Set Custom Model (Override) to `mistral-7b-instruct` to use that model for all requests, even if the default is set to `gpt-4`.
  - Leave Custom Model (Override) blank to use the default model selection logic.
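To make the priority order above concrete, here is a minimal sketch of the selection logic. The config paths (`genaker_magentomcpai/general/custom_model` and `genaker_magentomcpai/general/default_model`) are illustrative assumptions, not the module's actual system.xml paths:

```php
<?php
// Minimal sketch of the model selection priority described above.
// NOTE: the config paths below are illustrative assumptions, not the
// module's actual system.xml paths.
use Magento\Framework\App\Config\ScopeConfigInterface;

class ModelResolver
{
    private const FALLBACK_MODEL = 'gpt-3.5-turbo';

    public function __construct(private readonly ScopeConfigInterface $scopeConfig)
    {
    }

    public function resolve(): string
    {
        // 1. Custom Model (Override) always wins when it is set.
        $custom = trim((string) $this->scopeConfig->getValue('genaker_magentomcpai/general/custom_model'));
        if ($custom !== '') {
            return $custom;
        }

        // 2. Otherwise fall back to the configured Default AI Model.
        $default = trim((string) $this->scopeConfig->getValue('genaker_magentomcpai/general/default_model'));
        if ($default !== '') {
            return $default;
        }

        // 3. If both are empty, use gpt-3.5-turbo.
        return self::FALLBACK_MODEL;
    }
}
```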
- Add store-specific documentation
- Include table structures
- Document custom attributes
- Configure custom database connection in
app/etc/env.php
:
'db' => [
'ai_connection' => [
'host' => 'your_host',
'dbname' => 'your_database',
'username' => 'your_username',
'password' => 'your_password'
]
]
- You can add an additional read-only connection, or a connection whose database user has read-only privileges:
To create a read-only MySQL user for Magento, follow these steps:
- Connect to MySQL as root:
mysql -u root -p
- Create a new read-only user (replace username and password with your desired values):
CREATE USER 'username'@'localhost' IDENTIFIED BY 'password';
- Grant read-only privileges to the Magento database (replace magento_db with your database name):
GRANT SELECT ON magento_db.* TO 'username'@'localhost';
- For remote access (if needed), create the user with host '%':
CREATE USER 'username'@'%' IDENTIFIED BY 'password';
GRANT SELECT ON magento_db.* TO 'username'@'%';
- Flush privileges to apply changes:
FLUSH PRIVILEGES;
- Verify the user's privileges:
SHOW GRANTS FOR 'username'@'localhost';
Add the read-only user credentials to your app/etc/env.php file:
'db' => [
'connection' => [
'default' => [
'host' => 'localhost',
'dbname' => 'magento_db',
'username' => 'readonly_user',
'password' => 'your_password',
'model' => 'mysql4',
'engine' => 'innodb',
'initStatements' => 'SET NAMES utf8;',
'active' => '1'
],
'ai_connection' => [
'host' => 'localhost',
'dbname' => 'magento_db',
'username' => 'readonly_user',
'password' => 'your_password',
'model' => 'mysql4',
'engine' => 'innodb',
'initStatements' => 'SET NAMES utf8;',
'active' => '1'
]
]
]
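Once the ai_connection is defined, read-only queries can be routed through it via Magento's ResourceConnection. The sketch below is illustrative and assumes the connection name shown above; the module's own query service may wire this differently:

```php
<?php
// Sketch: running a read-only query over the dedicated 'ai_connection'.
// Assumes the connection name matches the key added to app/etc/env.php above.
use Magento\Framework\App\ResourceConnection;

class AiQueryRunner
{
    public function __construct(private readonly ResourceConnection $resource)
    {
    }

    public function fetchRecentOrders(int $limit = 10): array
    {
        // getConnectionByName() resolves a named connection from the
        // 'db' section of env.php.
        $connection = $this->resource->getConnectionByName('ai_connection');

        $select = $connection->select()
            ->from($this->resource->getTableName('sales_order'), ['entity_id', 'increment_id', 'grand_total'])
            ->order('created_at DESC')
            ->limit($limit);

        return $connection->fetchAll($select);
    }
}
```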
- Always use strong passwords
- Restrict access to specific IP addresses if possible
- Regularly audit user privileges
- Consider using SSL for remote connections
- Monitor database access logs
- Navigate to Marketing > AI Assistant > MCP AI Assistant
- Or use the direct URL:
/admin/magentomcpai/chat/index
- Type your question in natural language
- The assistant will convert it to SQL
- View results in the table below
- Export results to CSV if needed
- Use the "Clear" button to start a new conversation
- Export chat history using "Save Chat"
- Open chat in new window for better visibility
- View real-time token statistics in the expandable panel
- Track costs for each interaction
- Monitor cumulative session usage
- See breakdown by:
- Prompt tokens and cost
- Completion tokens and cost
- Total tokens and cost
- Costs automatically calculated based on the selected model
- If a query fails, click "Fix in Chat"
- The assistant will analyze the error
- It may suggest checking table structures
- Follow the suggested fixes
- Navigate to Marketing > AI Assistant > Product Chat
- Ask questions about your product catalog
- The assistant uses RAG (Retrieval Augmented Generation) to provide accurate information
- View product images and details directly in the chat
- Get formatted responses with proper styling
- Maintain conversation context for follow-up questions
- Track token usage for each interaction
- Handles all API communication
- Manages request formatting
- Processes responses
- Handles error cases
- Provides consistent response format
- Better separation of concerns
- Easier to test and maintain
- More flexible for customization
- Cleaner code organization
- Simplified error handling
- Monitors API usage in real-time
- Calculates costs based on current model
- Maintains session statistics
- Provides detailed usage breakdown
- Supports all OpenAI model pricing tiers
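As a rough illustration of how a per-request cost can be derived from token counts, consider the sketch below. The per-1K-token rates are placeholders only; actual pricing depends on the model and changes over time, and the module's rate table may differ:

```php
<?php
// Sketch: estimating request cost from the token usage returned by the API.
// The per-1K-token rates are illustrative placeholders, not authoritative pricing.
class TokenCostEstimator
{
    /** @var array<string, array{prompt: float, completion: float}> USD per 1,000 tokens */
    private array $rates = [
        'gpt-3.5-turbo' => ['prompt' => 0.0015, 'completion' => 0.002],
        'gpt-4'         => ['prompt' => 0.03,   'completion' => 0.06],
    ];

    public function estimate(string $model, int $promptTokens, int $completionTokens): float
    {
        $rate = $this->rates[$model] ?? $this->rates['gpt-3.5-turbo'];

        // Cost = (tokens / 1000) * rate, computed separately for prompt and completion.
        return ($promptTokens / 1000) * $rate['prompt']
            + ($completionTokens / 1000) * $rate['completion'];
    }
}
```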
- Uses a products.md file as the knowledge base
- Removes stop words to optimize token usage
- Enhances responses with accurate product information
- Supports multiple languages for stop words removal
- Properly formats product details with images and links
- Maintains conversation history for contextual responses
- Tracks token usage including cached tokens
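A minimal sketch of the stop-word removal idea, assuming a tiny English word list; the module's actual implementation and multi-language support may differ:

```php
<?php
// Sketch: stripping common stop words before building the prompt, so fewer
// tokens are sent to the API. The word list is a tiny illustrative sample.
function removeStopWords(string $text, array $stopWords = ['the', 'a', 'an', 'is', 'are', 'of', 'and', 'to']): string
{
    $words = preg_split('/\s+/u', $text, -1, PREG_SPLIT_NO_EMPTY);

    $kept = array_filter(
        $words,
        static fn (string $word): bool => !in_array(mb_strtolower($word), $stopWords, true)
    );

    return implode(' ', $kept);
}

// "Show me the list of all configurable products" -> "Show me list all configurable products"
```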
- Navigate to Stores > Configuration > Genaker > Magento MCP AI
- In the "AI Rules" field, add your custom rules
- Each rule should be on a new line
- Rules will override the default system message
- Navigate to Stores > Configuration > Genaker > Magento MCP AI
- In the "Documentation" field, add your store's specific information
- Include:
- Table structures and relationships
- Custom attributes and their usage
- Business logic and rules
- Common query patterns
- Be specific and clear in your rules
- Use consistent formatting
- Include examples where helpful
- Prioritize security rules
- Use clear headings for each section
- Include field types and constraints
- Document relationships between tables
- Add examples of common queries
- Keep documentation up to date
- Review and update rules regularly
- Monitor query performance
- Adjust based on user feedback
- Test new rules thoroughly
- Validate query results
- Check performance impact
- Monitor error rates
- Verify API key in configuration
- Check API key permissions
- Ensure proper formatting
- Use the "Fix in Chat" feature
- Check table structures
- Verify column names
- Review SQL syntax
- Clear conversation history
- Check database connection
- Monitor API usage in the token statistics panel
- Optimize queries
- Email: egorshitikov@gmail.com
- Fork the repository
- Create your feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
- Recognize objects, labels, and text in images
- Multiple detection types in a single request:
- LABEL_DETECTION: Identify objects, locations, activities
- TEXT_DETECTION: Extract text from images
- OBJECT_LOCALIZATION: Find and locate objects with bounding boxes
- DOCUMENT_TEXT_DETECTION: OCR optimized for dense text
- AI analysis of recognized content
- Comprehensive response formatting with confidence scores
- Support for handwritten text detection
- Convert audio files to text transcripts
- Multiple audio format support (LINEAR16, MP3, etc.)
- Language detection and multi-language support
- Automatic punctuation insertion
- Confidence scoring for transcription accuracy
- AI analysis of transcribed content
- Upload files to OpenAI for analysis and reference
- Support for multiple file formats (PDF, TXT, DOCX, etc.)
- Batch upload for multiple files
- File-based question answering
- Multi-file analysis and comparison
- Comprehensive error handling with detailed messages
- Assistants API integration for complex file operations
- Generate vector embeddings from text for semantic similarity comparison
- Supports state-of-the-art embedding models (text-embedding-ada-002)
- Used for semantic search, content recommendations, and similarity matching
- Handles both single text inputs and batch processing in a unified way
- Building semantic search systems
- Classifying content by similarity
- Recommending products based on descriptions
- Clustering similar content
- Create custom product images, promotional materials, and marketing content
- Support for both DALL-E 2 and DALL-E 3 models
- Adjustable image sizes, quality, and style
- Options for vivid or natural aesthetics
- Perfect for:
- Creating product imagery without professional photography
- Generating promotional banners and social media content
- Visualizing custom product configurations
- Creating lifestyle images showing products in use
- Seamless workflow between different media types
- Extract text from images and analyze with AI
- Transcribe audio and generate intelligent responses
- Combine file analysis with image recognition
- Multi-modal processing pipeline support
- Combines the power of LLMs with cached results for efficiency and accuracy
- Reduces API usage costs by reusing previous responses for similar queries
- Improves response speed for common or repeated questions
- Perfect for:
- Product information retrieval scenarios
- Customer service FAQ responses
- Category description generation
- Order status explanations
- Customer Service Chatbots: Cache common product questions and support responses
- Product Description Generation: Store and reuse stylistically similar descriptions
- Search Query Understanding: Cache interpretations of similar search queries
- Content Recommendations: Store previously generated recommendations for similar user profiles
- Significantly reduced API costs for repetitive operations
- Lower latency for customer-facing AI features
- Consistent responses for similar queries
- Ability to handle higher request volumes without throttling
- Fine-tuned control over when to generate new content vs. use cached responses
- Checking if a similar query exists in the cache
- Retrieving and adapting cached responses when appropriate
- Only calling the AI API for truly novel requests
- Gradually building a knowledge repository specific to your store
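A minimal sketch of this cache-then-generate flow using Magento's cache frontend; the cache key prefix, tag, and the $callAi callable are illustrative assumptions rather than the module's exact implementation:

```php
<?php
// Sketch: Cache-Augmented Generation — return a cached answer for a previously
// seen (normalized) question, otherwise call the AI once and cache the result.
// The cache key prefix and tag are illustrative assumptions.
use Magento\Framework\App\CacheInterface;

class CachedAnswerProvider
{
    private const CACHE_PREFIX = 'mcp_ai_answer_';
    private const LIFETIME = 86400; // one day, in seconds

    public function __construct(private readonly CacheInterface $cache)
    {
    }

    /**
     * @param callable(string): string $callAi performs the real API request
     */
    public function getAnswer(string $question, callable $callAi): string
    {
        // Normalize so trivially different phrasings map to the same key.
        $key = self::CACHE_PREFIX . sha1(mb_strtolower(trim($question)));

        $cached = $this->cache->load($key);
        if ($cached !== false) {
            return $cached; // cache hit: no API call, no token cost
        }

        $answer = $callAi($question);
        $this->cache->save($answer, $key, ['MCP_AI_CHAT'], self::LIFETIME);

        return $answer;
    }
}
```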
- Enhances AI responses with real-time data retrieved from your Magento database and documents
- Combines the reasoning capabilities of LLMs with factual, store-specific information
- Ensures AI responses reflect your current product catalog, inventory, and policies
- Reduces hallucinations and increases accuracy of AI-generated content
- Product Information Assistants: Provide accurate, up-to-date product details by retrieving data from your catalog
- Order Support: Pull real customer order data to answer shipping, return, and order status questions
- Policy Compliance: Ground responses in your actual store policies by retrieving relevant documentation
- Inventory-Aware Recommendations: Suggest alternatives based on current stock levels
- Customer query is analyzed to identify information needs
- Relevant data is retrieved from your Magento database or document store
- Retrieved information is fed to the LLM as context alongside the query
- AI generates a response that incorporates the retrieved facts
- Result is an answer that's both helpful and factually accurate about your specific store
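A minimal sketch of that retrieve-then-generate flow, reusing the sendChatRequest signature shown in the code examples; the SKU, repository lookup, and prompt wording are illustrative assumptions:

```php
<?php
// Sketch: Retrieval-Augmented Generation — fetch store-specific facts first,
// then pass them to the model as context next to the customer question.
// The SKU, repository lookup, and prompt wording are illustrative.
$question = 'Is the TL-30 gun safe in stock, and how much does it cost?';

// 1. Retrieve relevant, current data from Magento (simplified lookup).
$product = $productRepository->get('tl-30-gun-safe');
$context = sprintf(
    "Product: %s\nPrice: %.2f\nIn stock: %s",
    $product->getName(),
    $product->getPrice(),
    $product->isAvailable() ? 'yes' : 'no'
);

// 2. Feed the retrieved facts to the LLM as context alongside the question.
$messages = [
    ['role' => 'system', 'content' => "Answer using only the store data below.\n" . $context],
    ['role' => 'user', 'content' => $question],
];

// 3. The response is grounded in the retrieved facts, reducing hallucinations.
$answer = $openAiService->sendChatRequest($messages, 'gpt-3.5-turbo', $openAiApiKey);
```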
- Up-to-date information: Responses reflect your current catalog and pricing
- Store-specific accuracy: AI answers reference your actual policies and procedures
- Reduced hallucinations: Minimizes AI tendency to generate plausible but incorrect information
- Custom knowledge: Incorporates your unique product knowledge and business rules
- Better customer experience: More precise and trustworthy assistance
- Large or frequently changing product catalogs
- Complex product specifications
- Custom policies or shipping rules
- Need for personalized customer service at scale
- Enables AI to call specific Magento functions based on user requests
- Bridges natural language requests with structured API calls
- Perfect for enabling AI assistants to take actions within your store
- Provides structured data extraction from natural language queries
- Define functions that represent actions in your Magento store
- The AI analyzes user requests to determine which function to call
- AI extracts parameters from natural language and formats them correctly
- Your application receives structured function calls rather than raw text
- Execute the function with the provided parameters and return results
- Product Search: Convert natural language to structured search parameters
- Order Operations: Process return requests, order lookups, or shipping changes
- Customer Account Management: Update preferences, addresses, or subscriptions
- Inventory Queries: Check stock levels or availability based on conversational requests
- Convert spoken content into text with high accuracy
- Support for multiple output formats (JSON, text, SRT, VTT)
- Language detection and multi-language support
- Perfect for:
- Transcribing customer service calls
- Converting video tutorials into text content
- Creating accessible content for product videos
- Processing voice notes from sales teams
- Generate natural-sounding speech from text
- Multiple voice options (alloy, echo, fable, onyx, nova, shimmer)
- Support for various audio formats (MP3, Opus, AAC, FLAC)
- Adjustable speech speed for different use cases
- Perfect for:
- Creating audio product descriptions
- Adding voice prompts to your store interface
- Building voice notifications for order status
- Making content accessible to visually impaired customers
- Create a Google Cloud project
- Enable Vision API and Speech-to-Text API
- Create service account credentials
- Generate access token using:
- Configure OpenAI API key in admin panel
- Set appropriate file size limits
- Configure allowed file types:
- All media files are processed with strict validation
- No permanent storage of sensitive data
- Temporary file handling with secure cleanup
- Rate limiting for API requests
- Access control based on admin permissions
- Audit logging for all media processing operations
- Dual Theme Support: Works with both Hyva themes (Tailwind CSS & Alpine.js) and standard Magento themes
- AI-Powered Responses: Leverages OpenAI's models to provide intelligent answers
- Response Caching: Caches common questions for faster performance
- Suggested Questions: Displays customizable question prompts to guide customers
- Mobile Responsive: Fully responsive design works on all devices
- Navigate to Stores > Configuration > Genaker > Magento MCP AI > Customer Chatbot Configuration
- Configure the following settings:
- Enable Customer Chatbot: Turn the chatbot on/off
- Theme Type: Choose between Standard Magento or Hyva Theme
- Chatbot Title: Set the title displayed in the header
- Welcome Message: Customize the initial message
- AI Model: Select which AI model to use for responses
- Suggested Queries: Define common questions to display
- Chatbot Logo: Upload a custom logo (64x64px recommended)
- Frequently Asked Questions: Add common Q&A to cache
- Click to open the chat interface
- Select from suggested questions or type their own
- Receive AI-powered responses about products, policies, etc.
- Maintain conversation history during their session
- Uses the same OpenAI integration as the admin MCP AI Assistant
- Implements caching to reduce API costs for common questions
- Supports both Hyva and standard Magento themes through modular design
- Includes REST API endpoints for headless implementation
- Location: The configuration can be found in the Magento admin panel under Stores > Configuration > Genaker > Magento MCP AI > General Configuration.
- Default Value: Disabled (set to No).
- How to Enable: To enable sending data to AI, set the "Send Magento Data to AI" option to Yes.
. - Generating SQL queries for data analysis.
- Providing recommendations based on historical data.
- Enhancing decision-making processes with AI-driven insights.
- Security: Ensure that sensitive data is handled securely and that only necessary data is shared with the AI.
- Data Storage: Currently, the system does not store SQL data. Future updates may include options for storing or managing this data.
- Always review the data being sent to ensure compliance with data privacy regulations.
- Consider the implications of sharing data with external services and ensure that appropriate security measures are in place.
- Go to Magento Admin:
  - Navigate to Stores > Configuration > Genaker > Magento MCP AI > General Configuration.
- Find the "AI API Domain" field:
  - Default: https://api.openai.com
  - You can set this to any OpenAI-compatible endpoint, e.g.:
    - Official: https://api.openai.com
    - Local server: http://localhost:8000
    - Proxy: https://your-proxy.example.com
    - OpenRouter: https://openrouter.ai/api/v1
- Save the configuration and flush cache.
- Local Development:
  - Run a local OpenAI-compatible engine (e.g., LocalAI, Ollama, LM Studio) and set the domain to its local address (e.g., http://localhost:8000)
- Remote/Proxy AI Engines:
- Use a proxy or gateway that implements the OpenAI API (e.g., OpenRouter, Azure OpenAI, custom reverse proxy)
- Set the domain to your proxy URL
- Alternative Providers:
- Some providers (e.g., OpenRouter, Together, Replicate) offer OpenAI-compatible APIs. Set the domain to their endpoint.
- API Key: Make sure to use the correct API key for the chosen endpoint. Some providers require different keys.
- Model Names: Not all endpoints support all OpenAI models. Check the documentation for supported models.
- SSL/TLS: For production, use HTTPS endpoints. For local development, HTTP is acceptable.
- CORS/Firewall: Ensure your Magento server can reach the configured domain (check firewall, CORS, and network settings).
- Testing: After changing the domain, test the AI assistant to ensure connectivity and compatibility.
- If you see connection errors, verify the endpoint URL and that the server is reachable from your Magento instance.
- Check logs for detailed error messages.
- Some self-hosted engines may require additional configuration (e.g., model downloads, API enablement).
- Start LocalAI on your server:
local-ai --models-path /path/to/models --api-bind 0.0.0.0:8080
- In Magento admin, set AI API Domain to: http://localhost:8080
- Use a compatible model name (e.g., gpt-3.5-turbo if supported by your engine).
- Get your OpenRouter API key from https://openrouter.ai/
- Set AI API Domain to: https://openrouter.ai/api/v1
- Use your OpenRouter API key in the API Key field.
- OpenAI — The official provider for GPT-3.5, GPT-4, DALL-E, Whisper, etc.
- Azure OpenAI Service — Microsoft Azure's managed OpenAI API, supports enterprise features and regional hosting.
- OpenRouter — API gateway for multiple models (OpenAI, Anthropic, Google, Mistral, etc.) with a unified OpenAI-compatible API.
- Together AI — Cloud provider for open-source LLMs (Mixtral, Llama, MPT, etc.) with OpenAI-compatible endpoints.
- Groq — High-speed inference for Llama and Mixtral models, OpenAI API compatible.
- Replicate — Offers a wide range of models (vision, language, etc.) via an OpenAI-compatible API.
- Perplexity AI — Provides OpenAI-compatible endpoints for their models.
- LocalAI — Self-hosted OpenAI API for running LLMs, Whisper, and more locally or on your own server.
- Ollama — Local LLM runner with OpenAI-compatible API, supports models like Llama, Mistral, Phi, etc.
- LM Studio — Desktop app for running and chatting with local LLMs, exposes an OpenAI-compatible API.
- llama.cpp server — Lightweight C++ LLM server with OpenAI-compatible endpoints.
- FastChat — Multi-model, multi-user chat server with OpenAI API compatibility.
- vLLM — High-throughput LLM inference engine with OpenAI-compatible API.
- Not all providers support every OpenAI model or feature (e.g., function calling, images, audio). Check their documentation for supported endpoints and models.
- Some providers require special API keys or authentication methods.
- For local/self-hosted engines, you may need to download models and configure the server before use.
The module now uses a dedicated OpenAI service class (OpenAiService) that:
Benefits:
The token tracking system:
The Retrieval Augmented Generation system:
The AI assistant uses a customizable system message to define its behavior. You can modify this in the admin configuration:
You are a SQL query generator for Magento 2 database. Your role is to assist with database queries while maintaining security. Rules:
1. Generate only SELECT or DESCRIBE queries
2. Validate and explain each generated query
3. Start responses with SQL in triple backticks: ```sql SELECT * FROM table; ```
4. Reject any non-SELECT/DESCRIBE queries
5. Maintain conversation context for better assistance
6. Provide clear explanations of query results
Add your own rules in the admin configuration:
Example custom rules:
- Always include table aliases in queries
- Explain the purpose of each JOIN
- Provide alternative query suggestions
- Include performance considerations
Add store-specific documentation to improve query accuracy:
Example documentation:
Table: sales_order
- Contains order information
- Key fields: entity_id, increment_id, customer_id
- Related tables: sales_order_item, sales_order_address
Custom Attributes:
- product_custom_type: string, values: 'simple', 'configurable', 'bundle'
- order_priority: integer, values: 1-5
For support, please contact:
This module is licensed under the MIT License.
// Basic image recognition
$imageData = $openAiService->recognizeImage(
'/path/to/image.jpg',
$googleAccessToken
);
// Extract text from documents (OCR)
$textData = $openAiService->extractTextFromImage(
'/path/to/document.jpg',
$googleAccessToken
);
// AI analysis of image content
$analysis = $openAiService->recognizeImageWithAiAnalysis(
'/path/to/image.jpg',
$googleAccessToken,
$openAiApiKey,
'Describe what you see in this image. Labels detected: {{LABELS}}. Text found: {{TEXT}}.'
);
// Basic speech-to-text conversion
$transcript = $openAiService->speechToText(
'/path/to/audio.mp3',
$googleAccessToken,
'en-US',
'MP3',
44100
);
// Speech transcription with AI analysis
$analysis = $openAiService->speechToTextWithAiResponse(
'/path/to/audio.mp3',
$googleAccessToken,
$openAiApiKey,
"Please summarize this transcription: ",
'gpt-3.5-turbo',
'en-US'
);
// Upload a single file
$fileData = $openAiService->uploadFile(
'/path/to/document.pdf',
'assistants',
$openAiApiKey
);
// Upload multiple files in batch
$batchResults = $openAiService->batchUploadFiles(
['/path/to/doc1.pdf', '/path/to/doc2.pdf'],
'assistants',
$openAiApiKey
);
// Ask questions about a file
$answer = $openAiService->getFileAnswers(
'What is the main topic discussed in this document?',
$fileData['id'],
$openAiApiKey
);
// Compare information across multiple files
$comparison = $openAiService->getFileAnswers(
'What are the key differences between these documents?',
[$fileId1, $fileId2, $fileId3],
$openAiApiKey
);
curl https://api.openai.com/v1/embeddings \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_API_KEY" \
-d '{
"input": "I love cats",
"model": "text-embedding-ada-002"
}'
curl https://api.openai.com/v1/embeddings \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_API_KEY" \
-d '{
"input": ["I love cats", "I love dogs", "Pets are great"],
"model": "text-embedding-ada-002"
}'
The embeddings API returns vector representations of text that can be used to measure semantic similarity between pieces of text. This is useful for:
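For instance, the similarity between two embedding vectors returned by the API is commonly measured with cosine similarity; here is a small sketch:

```php
<?php
// Sketch: cosine similarity between two embedding vectors returned by the
// embeddings API. Values close to 1.0 indicate semantically similar texts.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

// Usage: $score = cosineSimilarity($catsEmbedding, $dogsEmbedding);
```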
curl https://api.openai.com/v1/images/generations \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_API_KEY" \
-d '{
"prompt": "Promotional image of a high-security gray TL-30 burglary safe on orange background with price $4,250",
"n": 1,
"size": "1024x1024"
}'
The image generation API creates vivid, detailed images based on your text prompts, opening up new possibilities for e-commerce visual content creation without the need for expensive photography or design resources.
// Extract text from image and use it with file reference
$textData = $openAiService->extractTextFromImage('/path/to/image.jpg', $googleAccessToken);
$messages = [
['role' => 'system', 'content' => 'Compare the text from this image with the uploaded document.'],
['role' => 'user', 'content' => 'Image text: ' . $textData['text']]
];
$comparison = $openAiService->sendFileReferenceChatRequest($messages, $fileId, 'gpt-4', $openAiApiKey);
// Convert speech to text and then analyze with AI
$transcript = $openAiService->speechToText('/path/to/audio.mp3', $googleAccessToken);
$messages = [
['role' => 'system', 'content' => 'You are an expert content analyzer.'],
['role' => 'user', 'content' => 'Analyze this transcript: ' . $transcript['transcript']]
];
$analysis = $openAiService->sendChatRequest($messages, 'gpt-3.5-turbo', $openAiApiKey);
Cache-Augmented Generation works by:
This approach is especially valuable in e-commerce where many customer questions follow predictable patterns and where response speed directly impacts conversion rates.
8. Retrieval-Augmented Generation (RAG)
RAG is particularly valuable for Magento stores with:
curl https://api.openai.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_API_KEY" \
-d '{
"model": "gpt-4",
"messages": [
{
"role": "user",
"content": "Find me a gun safe under $2000"
}
],
"functions": [
{
"name": "search_safes",
"description": "Search safes by type and price",
"parameters": {
"type": "object",
"properties": {
"type": {
"type": "string",
"description": "The type of safe (gun, jewelry, document)"
},
"max_price": {
"type": "number",
"description": "Maximum price in USD"
}
},
"required": ["type", "max_price"]
}
}
],
"function_call": "auto"
}'
AI Response (converts natural language to structured function call):
{
"function_call": {
"name": "search_safes",
"arguments": "{\"type\":\"gun\",\"max_price\":2000}"
}
}
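Your application then executes the requested function with the extracted parameters. The sketch below shows one way to dispatch the structured call to a Magento product lookup; the collection filtering and attribute codes are simplified, illustrative assumptions:

```php
<?php
// Sketch: dispatching the structured function_call returned by the API.
// $response is the decoded JSON shown above; the collection filtering and
// attribute codes are simplified, illustrative assumptions.
use Magento\Catalog\Model\ResourceModel\Product\CollectionFactory;

function handleFunctionCall(array $response, CollectionFactory $collectionFactory): array
{
    $call = $response['function_call'];
    $args = json_decode($call['arguments'], true);

    if ($call['name'] === 'search_safes') {
        // Translate the extracted parameters into a catalog query.
        $collection = $collectionFactory->create()
            ->addAttributeToSelect(['name', 'price'])
            ->addAttributeToFilter('name', ['like' => '%' . $args['type'] . ' safe%'])
            ->addAttributeToFilter('price', ['lteq' => $args['max_price']]);

        return $collection->toArray(['name', 'price']);
    }

    throw new \InvalidArgumentException('Unknown function: ' . $call['name']);
}
```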
Function calling bridges the gap between conversational AI and your Magento backend systems, enabling rich, action-oriented customer experiences while maintaining control over business logic and data access.
curl https://api.openai.com/v1/audio/transcriptions \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: multipart/form-data" \
-F file=@audio.mp3 \
-F model=whisper-1
The Audio Transcription API uses OpenAI's Whisper model to accurately convert spoken audio to text, supporting multiple languages and various output formats for different use cases.
curl https://api.openai.com/v1/audio/speech \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "tts-1",
"voice": "alloy",
"input": "Your Magento order has been confirmed and will be shipped within 24 hours. Thank you for shopping with us!"
}'
The Text-to-Speech API transforms written text into natural-sounding speech, enabling you to create audio content dynamically from your Magento store's data.
// Example code to get Google access token
$accessToken = $googleAuthService->getAccessToken($serviceAccountKeyFile);
pdf,txt,csv,json,docx,xlsx
The Customer Service Chatbot is a powerful AI-powered interface for your storefront that helps answer customer questions, provide product information, and handle common support requests.
Once configured, the chatbot appears as a floating button in the bottom-right corner of your store. Customers can:
The "Send Magento Data to AI" configuration allows you to control whether Magento data is sent to the AI for processing. This setting is useful for managing data privacy and ensuring that only necessary data is shared with the AI.
When enabled, this configuration allows the AI to receive and process Magento data, which can be used to generate SQL queries or provide insights based on the data. The AI can assist with tasks such as:
Magento MCP AI allows you to configure the base domain for the OpenAI API. This means you can use not only the official OpenAI endpoints, but also self-hosted, proxy, or alternative OpenAI-compatible AI engines (such as local LLMs, OpenRouter, or other providers that implement the OpenAI API spec).
Below is a list of popular AI engines and providers that implement the OpenAI API (chat/completions, embeddings, etc.) and can be used with Magento MCP AI by setting the API domain and using a compatible API key:
This flexibility allows you to use the Magento MCP AI Assistant with a wide range of AI backends, including private, on-premise, or alternative cloud providers, as long as they are OpenAI API compatible.