Product and Price Monitoring System: Building an Automated Alert and Notification Workflow
For those operating on e-commerce platforms like Taobao or conducting market analysis, real-time tracking of product prices and inventory fluctuations is crucial for promotional strategies, price competition, and informed decision-making. Even slight variations in pricing can signal supply chain issues, competitor activity, or the start of promotional campaigns. Therefore, establishing a comprehensive monitoring system provides a solid foundation for business decisions.
This article details how to build a fully automated product and price monitoring system, leveraging scheduled crawling, data comparison, and configurable thresholds to trigger alerts and send real-time notifications. It helps you capture opportunities early and improve operational efficiency, while offering guidance for further expansion and optimization.
1. System Architecture Overview
The system is composed of five major modules, each playing a critical role in achieving a fully automated monitoring pipeline:
Crawling Engine (Scrapy): Periodically fetches product page information including price, stock, and name.
Data Storage (MongoDB): Stores product historical data in a time-series format to support comparison and query.
Change Detection Logic: Compares old and new data based on pre-set rules to identify significant changes.
Notification Module (Email/Slack/Webhook): Pushes real-time alerts when anomalies are detected.
Configuration and Threshold Center: Allows users to set customized rules per product such as price fluctuation percentage and minimum inventory level.
The architecture is modular and scalable, making it easy to integrate new features or extend coverage to more products.
2. Data Model Design
Each product snapshot is stored as a JSON document to ensure a consistent structure and ease of querying. Each record includes a timestamp and the unique product identifier sku_id:
{"sku_id": "1234567890",
"title": "2024 Summer Cotton T-shirt",
"price": 59.9,
"stock": 48,
"shop_name": "Tmall Flagship Store",
"timestamp": "2025-05-05T12:00:00"
}
Each update creates a new record, allowing a complete historical track of the product's price and stock evolution.
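As a minimal illustration, such a snapshot could be appended to MongoDB with pymongo. The database and collection names (price_monitor, price_records) are assumptions chosen to match the query examples later in this article:

from datetime import datetime, timezone
from pymongo import MongoClient

# Connection string, database, and collection names are placeholders.
client = MongoClient("mongodb://localhost:27017")
collection = client["price_monitor"]["price_records"]

def save_snapshot(item):
    # Append a new document; existing history is never overwritten.
    item.setdefault("timestamp", datetime.now(timezone.utc).isoformat())
    collection.insert_one(item)

save_snapshot({
    "sku_id": "1234567890",
    "title": "2024 Summer Cotton T-shirt",
    "price": 59.9,
    "stock": 48,
    "shop_name": "Tmall Flagship Store",
})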
3. Crawling Engine Design (Scrapy)
Scrapy is a powerful web scraping framework for structured data extraction; paired with an external scheduler, it can run as a periodic crawling task. Below is a simplified spider example:
import scrapy
from datetime import datetime

class TaobaoPriceSpider(scrapy.Spider):
    name = 'price_monitor'
    start_urls = [
        'https://detail.tmall.com/item.htm?id=1234567890',
    ]

    def parse(self, response):
        # Extract the fields defined in the data model; the CSS selectors
        # depend on the product page layout and may need adjustment.
        yield {
            'sku_id': '1234567890',
            'title': response.css('h1::text').get().strip(),
            'price': float(response.css('.tm-price::text').get()),
            'stock': int(response.css('#J_SpanStock::text').get()),
            'shop_name': response.css('.slogo-shopname strong::text').get(),
            'timestamp': datetime.utcnow().isoformat()
        }
The crawler can be scheduled using Crontab, Airflow, or APScheduler to run every 10 minutes or hourly.
To support multiple product pages, simply expand the start_urls list and implement the corresponding parsing logic.
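As a rough sketch of the APScheduler option, the job below shells out to the Scrapy command-line interface every 10 minutes, which avoids restarting Twisted's reactor inside a single long-running process; the interval is an example value:

import subprocess
from apscheduler.schedulers.blocking import BlockingScheduler

scheduler = BlockingScheduler()

# Run the spider every 10 minutes via the Scrapy CLI.
@scheduler.scheduled_job("interval", minutes=10)
def run_price_monitor():
    subprocess.run(["scrapy", "crawl", "price_monitor"], check=False)

if __name__ == "__main__":
    scheduler.start()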
4. Change Detection and Alert Logic
Configurable Alert Rules
Customizable rules allow the system to detect abnormal changes based on historical data comparison. For example:
Send alerts when the price drops more than 10%
Send alerts when inventory falls below 5 units
Optional: Trigger alerts for price hikes or out-of-stock situations
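One possible shape for these per-product rules is a simple mapping keyed by sku_id; the field names (price_drop_pct, min_stock, alert_on_out_of_stock) are illustrative assumptions rather than a fixed schema:

# Hypothetical per-SKU rule configuration; thresholds are examples only.
ALERT_RULES = {
    "1234567890": {
        "price_drop_pct": 0.10,          # alert on a 10% or larger price drop
        "min_stock": 5,                  # alert when stock falls to 5 units or fewer
        "alert_on_out_of_stock": True,
    },
    "default": {                         # fallback for SKUs without explicit rules
        "price_drop_pct": 0.15,
        "min_stock": 3,
        "alert_on_out_of_stock": False,
    },
}

def rules_for(sku_id):
    return ALERT_RULES.get(sku_id, ALERT_RULES["default"])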
Comparison Logic Example
def compare_and_alert(latest, previous):
    price_change = (previous['price'] - latest['price']) / previous['price']
    if price_change >= 0.1:
        send_alert(
            f"Price Drop Alert: {latest['title']}",
            f"Price dropped from {previous['price']} to {latest['price']}"
        )
    if latest['stock'] <= 5:
        send_alert(
            f"Low Stock Alert: {latest['title']}",
            f"Only {latest['stock']} items left in stock"
        )
This logic can be extended for different product categories, enabling rule customization to improve alert accuracy.
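A minimal sketch of how this comparison could be wired to the stored history: fetch the two most recent snapshots for a SKU from the price_records collection (names assumed from the data model above) and pass them to compare_and_alert:

from pymongo import MongoClient

collection = MongoClient("mongodb://localhost:27017")["price_monitor"]["price_records"]

def check_sku(sku_id):
    # Fetch the two most recent snapshots, newest first.
    snapshots = list(
        collection.find({"sku_id": sku_id}).sort("timestamp", -1).limit(2)
    )
    if len(snapshots) < 2:
        return  # not enough history to compare yet
    compare_and_alert(snapshots[0], snapshots[1])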
5. Notification and Integration: Automated Push Notifications
Once anomalies are detected, the system can automatically push alert messages via different channels:
1. Email Notification
SMTP can be used for sending emails, suitable for formal or internal alerts.
import smtplib
from email.mime.text import MIMEText

def send_email(subject, content):
    msg = MIMEText(content, 'plain', 'utf-8')
    msg['Subject'] = subject
    msg['From'] = 'bot@example.com'
    msg['To'] = 'user@example.com'

    # SMTP host and credentials below are placeholders.
    smtp = smtplib.SMTP_SSL('smtp.example.com', 465)
    smtp.login('bot@example.com', 'yourpassword')
    smtp.sendmail('bot@example.com', ['user@example.com'], msg.as_string())
    smtp.quit()
2. Slack / Discord Webhook Notification
Suitable for real-time team collaboration and mobile notifications:
import requests

def send_webhook(message, webhook_url):
    payload = {"text": message}
    requests.post(webhook_url, json=payload)
Support for other channels such as Telegram or LINE can be added easily.
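Note that compare_and_alert above calls a send_alert helper that is not defined in this article; one simple way to implement it is as a thin dispatcher that fans each alert out to every configured channel (the webhook URL is a placeholder):

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def send_alert(subject, content):
    # Fan one alert out to every configured channel.
    send_email(subject, content)
    send_webhook(f"*{subject}*\n{content}", SLACK_WEBHOOK_URL)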
6. Incremental Storage and Query Optimization
To handle large volumes of historical data efficiently, the following optimization strategies are recommended:
MongoDB Optimization
Create a compound index on sku_id + timestamp to speed up time-series queries.
Use capped collections or TTL indexes to automatically remove outdated data and control storage usage.
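As a sketch with pymongo (connection and collection names assumed as before), the two indexes could be created as follows; note that a TTL index only expires documents whose timestamp is stored as a BSON date, so it would require storing a datetime object rather than the ISO string shown in the data model:

from pymongo import ASCENDING, DESCENDING, MongoClient

collection = MongoClient("mongodb://localhost:27017")["price_monitor"]["price_records"]

# Compound index for fast per-SKU time-series queries.
collection.create_index([("sku_id", ASCENDING), ("timestamp", DESCENDING)])

# TTL index keeping roughly 90 days of history (requires a BSON date field).
collection.create_index("timestamp", expireAfterSeconds=90 * 24 * 3600)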
Historical Price Trend Query
Use the Aggregation Pipeline to retrieve recent price records for analysis or visualization:
db.price_records.aggregate([
  { "$match": { "sku_id": "1234567890" } },
  { "$sort": { "timestamp": -1 } },
  { "$limit": 10 },
  { "$project": { "price": 1, "timestamp": 1 } }
])
This design can be extended to support statistical analysis such as average price, standard deviation, or volatility range.
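For example, a $group stage can compute the average, spread, and range of recorded prices for one SKU; the sketch below runs it through pymongo and reuses the collection names assumed earlier:

from pymongo import MongoClient

collection = MongoClient("mongodb://localhost:27017")["price_monitor"]["price_records"]

# Summary statistics over the stored price history for one SKU.
pipeline = [
    {"$match": {"sku_id": "1234567890"}},
    {"$group": {
        "_id": "$sku_id",
        "avg_price": {"$avg": "$price"},
        "std_price": {"$stdDevPop": "$price"},
        "min_price": {"$min": "$price"},
        "max_price": {"$max": "$price"},
    }},
]
for doc in collection.aggregate(pipeline):
    print(doc)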
7. Scalability Recommendations
With a solid foundation, the system can be further enhanced to support advanced business needs:
Modular Configuration Center: Separate monitoring rules from data logic, and provide a user interface for rule management.
Monitoring Dashboard: Integrate with Grafana or Kibana to visualize real-time price changes and alert history.
Behavior Prediction: Incorporate machine learning models to predict future price trends and support proactive decisions.
Notification History Log: Record each alert to prevent duplicate notifications within short intervals.
API Access: Provide RESTful APIs for third-party integration and data access.
Conclusion
This article presented a complete architecture for a product and price monitoring system, covering the entire process from data crawling and change detection to automated alerting and optimized querying. With this system in place, e-commerce operators and market analysts can respond swiftly to pricing or inventory changes, enhance business agility, and make more informed decisions. With further module integration and intelligent features, this solution can evolve into a highly automated and predictive commercial monitoring platform.