MQL5 Algo Trading
The best publications of the largest community of algotraders.

Subscribe to stay up to date with modern technologies and the development of trading programs.
The article highlights the FITS (Frequency Interpolation Time Series) method, an innovative approach to time series analysis and forecasting. FITS represents time series data in the frequency domain, using frequency interpolation to expand the analyzed time window at low computational cost. It employs complex-valued neural networks to estimate amplitude and phase, ensuring efficient data analysis. FITS stands out for its low parameter count, making it suitable for resource-constrained environments such as mobile devices. Implemented in MQL5, the FITS method leverages the fast Fourier transform and OpenCL for efficient computation, creating a scalable solution for traders and developers interested in advanced algorithmic trading models.
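
To make the core idea concrete, below is a minimal MQL5 sketch of frequency interpolation: the window is mapped to the frequency domain with a naive DFT, and the same coefficients are evaluated on a longer grid to extend it. This is an illustration under simplifying assumptions (Nyquist bin dropped, O(N²) transform), not the article's FFT/OpenCL code; the function name is hypothetical.

```mql5
//--- FITS-style frequency interpolation sketch (naive DFT for clarity)
void FrequencyInterpolate(const double &src[], double &dst[], const int out_len)
  {
   int n    = ArraySize(src);
   int half = (n + 1) / 2;            // DC .. below-Nyquist bins (Nyquist dropped)
   double re[], im[];
   ArrayResize(re, half);
   ArrayResize(im, half);
//--- forward DFT of the real input window
   for(int k = 0; k < half; k++)
     {
      re[k] = 0.0;
      im[k] = 0.0;
      for(int t = 0; t < n; t++)
        {
         double a = 2.0 * M_PI * k * t / n;
         re[k] += src[t] * MathCos(a);
         im[k] -= src[t] * MathSin(a);
        }
     }
//--- evaluate the trigonometric interpolant on a longer grid:
//--- the same coefficients over out_len points extend the window
   ArrayResize(dst, out_len);
   for(int t = 0; t < out_len; t++)
     {
      double s = re[0] / n;            // DC component
      for(int k = 1; k < half; k++)
        {
         double a = 2.0 * M_PI * k * t / out_len;
         s += 2.0 / n * (re[k] * MathCos(a) - im[k] * MathSin(a));
        }
      dst[t] = s;
     }
  }
```
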
#MQL5 #MT5 #TimeSeries #AlgoTrading

Read more...
πŸ‘19❀11✍2πŸ‘Œ1πŸ‘¨β€πŸ’»1
Discover how ATFNet leverages the dual analysis power of time and frequency domains for improved time series forecasting. By integrating complex number operations and advanced neural network architectures, ATFNet addresses challenges in spectral prediction and local-global dependencies. The method includes an extended DFT for better frequency alignment and a unique energy weighting mechanism to optimize predictions. Features such as Complex Spectral Attention capture diverse frequency combinations, while specialized blocks handle local time dependencies and global frequency characteristics independently. Trials on real datasets show ATFNet surpasses many existing models, offering significant advancements for traders and developers in algorithmic trading.
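
As a rough illustration of the energy weighting idea, the sketch below blends the time-branch and frequency-branch forecasts by the share of spectral energy lying outside the DC bin (obtainable via Parseval's theorem without an explicit transform). The weighting rule and names are assumptions for illustration, not the paper's exact formula.

```mql5
//--- share of spectral energy outside the DC (mean) component
double DominantEnergyRatio(const double &x[])
  {
   int n = ArraySize(x);
   double total = 0.0, dc = 0.0;
   for(int t = 0; t < n; t++)
     {
      total += x[t] * x[t];
      dc    += x[t];
     }
   dc = dc * dc / n;                  // energy of the DC bin, by Parseval
   if(total <= 0.0)
      return(0.0);
   return(MathMax(0.0, (total - dc) / total));
  }

//--- blend the two branch forecasts with the energy-based weight
void BlendForecasts(const double &time_fc[], const double &freq_fc[],
                    const double &input[], double &out[])
  {
   double w = DominantEnergyRatio(input);    // weight of the F-branch
   int n = ArraySize(time_fc);
   ArrayResize(out, n);
   for(int i = 0; i < n; i++)
      out[i] = w * freq_fc[i] + (1.0 - w) * time_fc[i];
  }
```
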
#MQL5 #MT5 #ATFNet #TimeSeries

Read more...
❤32 👍19 👌4 👨‍💻4 ✍2 ⚡1
The previous installment examined the ATFNet algorithm in depth, showcasing its combination of time and frequency domain forecasting models. The ATFNet architecture employs the Transformer’s multi-layer Encoder with multi-headed Self-Attention, particularly in its frequency F-Block, utilizing complex number mathematics. This piece delves into the implementation of ATFNet by crafting the CNeuronATFNetOCL class, which encapsulates the entire algorithm. Although wrapping everything in a single class is not optimal, this approach keeps compatibility with the previously built sequential models, which lack support for the multi-process operations inherent to the T-Block and F-Block.

The internal objects, declared statically, are separated into time and frequency blocks, which keeps the structure clear and organized. It begins with internal o...
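
As a rough outline, a wrapper class along these lines could be declared as follows; member names here are hypothetical, and the sketch assumes the neural network library (NeuroNet.mqh) used throughout this article series.

```mql5
//--- Illustrative outline only: members are hypothetical, the article's
//--- actual interface may differ
#include "NeuroNet.mqh"   // base classes of this article series (assumed)

class CNeuronATFNetOCL : public CNeuronBaseOCL
  {
protected:
   //--- T-Block: time-domain pipeline, declared statically
   CNeuronMLMHAttentionOCL m_cTimeEncoder;
   //--- F-Block: frequency-domain pipeline with complex arithmetic
   CNeuronMLMHAttentionOCL m_cFreqEncoder;

public:
                     CNeuronATFNetOCL(void) {}
                    ~CNeuronATFNetOCL(void) {}
   //--- run both blocks and blend their forecasts
   virtual bool      feedForward(CNeuronBaseOCL *NeuronOCL);
   virtual bool      calcInputGradients(CNeuronBaseOCL *NeuronOCL);
   virtual bool      updateInputWeights(CNeuronBaseOCL *NeuronOCL);
  };
```
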
#MQL5 #MT5 #Algorithm #TimeSeries

Read more...
πŸ‘25❀19πŸ‘Œ13✍2⚑2πŸ†2😁1
The article delves into the challenges faced by traditional models in capturing long-term dependencies in time series data, often due to their limited ability to explore relationships between distant data segments. It introduces the Segment, Shuffle, and Stitch (S3) mechanism, designed to optimize time series representation by strategically rearranging data for improved learning. S3 segments the series, optimizes segment order through learnable weights, and combines data using a weighted sum. This modular mechanism seamlessly integrates with existing models, enhancing training procedures and performance significantly. The method is computationally efficient, with minimal hyperparameters, and can drastically improve forecasting and classification model accuracy when integrated into neural architectures.
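
A minimal MQL5 sketch of the S3 forward pass, assuming the series length is divisible by the segment count and replacing the paper's differentiable sorting relaxation with a hard ranking of the learnable weights:

```mql5
//--- Segment, Shuffle, and Stitch: split into m segments, reorder by
//--- learnable priority weights, blend shuffled and original series
void S3Forward(const double &x[], const double &seg_weight[],
               const double w_orig, const double w_shuf, double &out[])
  {
   int n   = ArraySize(x);
   int m   = ArraySize(seg_weight);   // number of segments
   int len = n / m;                   // segment length (n divisible by m)
//--- rank segments by their priority weight (descending)
   int order[];
   ArrayResize(order, m);
   for(int i = 0; i < m; i++)
      order[i] = i;
   for(int i = 0; i < m - 1; i++)     // simple selection sort
      for(int j = i + 1; j < m; j++)
         if(seg_weight[order[j]] > seg_weight[order[i]])
           {
            int tmp = order[i]; order[i] = order[j]; order[j] = tmp;
           }
//--- stitch: concatenate segments in the learned order, then blend
   ArrayResize(out, n);
   for(int s = 0; s < m; s++)
      for(int t = 0; t < len; t++)
        {
         double shuffled = x[order[s] * len + t];
         out[s * len + t] = w_orig * x[s * len + t] + w_shuf * shuffled;
        }
  }
```
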
#MQL5 #MT5 #TimeSeries #AI

Read more...
πŸ‘40❀22πŸ‘Œ5πŸ”₯2πŸ€“2πŸ‘¨β€πŸ’»2
Time series analysis plays a crucial role in fields like finance, allowing for the prediction of future trends using sequences of observations collected over time. Deep learning models have shown effectiveness in capturing nonlinear relationships and handling long-term dependencies in time series data. The MSFformer model introduces a multi-scale feature extraction approach, efficiently integrating long-term and short-term dependencies. Key components include the CSCM module, which constructs multi-level temporal information, and the Skip-PAM mechanism that processes input data at varying time intervals. These improvements enhance time series forecasting accuracy by effectively managing complex temporal relationships at multiple scales.
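
The multi-scale idea behind Skip-PAM can be illustrated with a simple skip-sampling helper (hypothetical name, not the article's code): sub-series taken at different strides expose the same window at different time scales for downstream attention.

```mql5
//--- extract every `stride`-th point starting from `offset`
void SkipSample(const double &x[], const int stride, const int offset,
                double &sub[])
  {
   int n = ArraySize(x);
   int m = (n - offset + stride - 1) / stride;   // ceil((n-offset)/stride)
   ArrayResize(sub, m);
   for(int i = 0; i < m; i++)
      sub[i] = x[offset + i * stride];
  }
//--- usage: stride 1 keeps fine detail, larger strides expose structure
//--- at coarser time scales of the same window
// SkipSample(price, 1, 0, fine);    // short-term dependencies
// SkipSample(price, 5, 0, coarse);  // longer-term dependencies
```
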
#MQL5 #MT5 #TimeSeries #DeepLearning

Read more...
❤29 👍27 👨‍💻4 👀4 🤔2 👌2 ✍1
The article delves into the Bidirectional Piecewise Linear Representation (BPLR) algorithm, a method to efficiently detect collective anomalies in time series data by reducing dimensionality. BPLR approximates datasets using linear functions, enabling swift analysis crucial for financial markets. Instead of processing raw time series data, BPLR divides datasets into segments, identifying Trend Turning Points (TTPs) for anomaly detection. For practical application, BPLR is implemented on the OpenCL platform to handle large datasets with lower computational costs. This piecewise linear approach offers a robust tool for developers and traders looking for efficient data segmentation and anomaly detection in dynamic environments.
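
A simplified sketch of the first step: detecting Trend Turning Points as local slope sign changes, so that each pair of consecutive TTPs bounds a linear segment. The article's version adds OpenCL batching and anomaly scoring on top of such a representation.

```mql5
//--- mark indices where the local slope changes sign (assumes n >= 2)
void FindTurningPoints(const double &x[], int &ttp[])
  {
   int n = ArraySize(x);
   ArrayResize(ttp, n);               // upper bound, trimmed below
   int cnt = 0;
   ttp[cnt++] = 0;                    // series start opens the first segment
   for(int t = 1; t < n - 1; t++)
     {
      double d1 = x[t] - x[t - 1];
      double d2 = x[t + 1] - x[t];
      if(d1 * d2 < 0.0)               // slope sign change => turning point
         ttp[cnt++] = t;
     }
   ttp[cnt++] = n - 1;                // series end closes the last segment
   ArrayResize(ttp, cnt);
  }
```
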
#MQL5 #MT5 #Algorithm #TimeSeries

Read more...
πŸ‘34❀10πŸ‘¨β€πŸ’»3🀯1πŸ‘Œ1πŸ€“1
Time series forecasting in multivariate settings faces challenges due to the curse of dimensionality. Traditional models struggle with performance when data windows are insufficient, demanding innovative approaches for analysis. The Spatio-Temporal Information (STI) Transformation equation addresses these issues by translating multivariate spatial data into the temporal dynamics of the target variable, expanding sample size and countering short-term data limitations.

Utilizing Transformer-based models enhances this process through the Self-Attention mechanism. Transformers capture global relationships without considering variable distance, alleviating dimensionality issues. The Spatiotemporal Transformer Neural Network (STNN) demonstrated in research efficiently forecasts multivariate time series by integrating STI with Transformer architecture.
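
A sketch of STI-style sample construction: each multivariate observation is paired with the next `horizon` values of the target variable, so cross-sectional (spatial) information is converted into temporal dynamics of the target and the effective sample count grows. The row-major layout and names are assumptions for illustration.

```mql5
//--- build (X_t -> y_t..y_{t+horizon-1}) training pairs
void BuildSTISamples(const double &series[],      // T*D matrix, row-major
                     const int vars,              // D: number of variables
                     const int target,            // index of target variable
                     const int horizon,           // L: output window length
                     double &inputs[],            // samples * D
                     double &labels[])            // samples * horizon
  {
   int total   = ArraySize(series) / vars;        // T observations
   int samples = total - horizon + 1;
   ArrayResize(inputs, samples * vars);
   ArrayResize(labels, samples * horizon);
   for(int t = 0; t < samples; t++)
     {
      for(int d = 0; d < vars; d++)               // spatial input X_t
         inputs[t * vars + d] = series[t * vars + d];
      for(int h = 0; h < horizon; h++)            // temporal target window
         labels[t * horizon + h] = series[(t + h) * vars + target];
     }
  }
```
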
#MQL5 #MT5 #TimeSeries #AlgoTrading

Read more...
πŸ‘32❀10πŸ‘€4πŸ‘Œ2πŸ‘¨β€πŸ’»2
Recent research has highlighted limitations in deep learning-based time series modeling, noting that shallow networks or linear models often outperform them on certain benchmarks. As foundational models in NLP and CV advance, their adaptation to time series data remains a challenge. TEMPO is introduced as a generative pre-trained transformer model focused on time series forecasting. It uses GPT and a two-component analytical framework to address specific temporal patterns like trends and seasonality. TEMPO enhances prediction accuracy by decomposing time series data into components, enriching model performance through an innovative prompt-based approach. This approach combines trend, seasonality, and residuals for a cohesive forecasting solution.
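
The decomposition step TEMPO builds on can be sketched as follows: a centred moving average extracts the trend, per-phase means over a known period give the seasonality, and the remainder is the residual. Window handling is simplified for illustration, and the input is assumed to span at least one full period.

```mql5
//--- classic trend/seasonality/residual split (assumes n >= period)
void Decompose(const double &x[], const int period,
               double &trend[], double &season[], double &resid[])
  {
   int n = ArraySize(x);
   ArrayResize(trend, n);
   ArrayResize(season, n);
   ArrayResize(resid, n);
//--- trend: centred moving average of width `period`
   for(int t = 0; t < n; t++)
     {
      int lo = MathMax(0, t - period / 2);
      int hi = MathMin(n - 1, t + period / 2);
      double s = 0.0;
      for(int k = lo; k <= hi; k++)
         s += x[k];
      trend[t] = s / (hi - lo + 1);
     }
//--- seasonality: mean detrended value for each phase of the period
   double phase_sum[], phase_cnt[];
   ArrayResize(phase_sum, period);  ArrayInitialize(phase_sum, 0.0);
   ArrayResize(phase_cnt, period);  ArrayInitialize(phase_cnt, 0.0);
   for(int t = 0; t < n; t++)
     {
      phase_sum[t % period] += x[t] - trend[t];
      phase_cnt[t % period] += 1.0;
     }
   for(int t = 0; t < n; t++)
     {
      season[t] = phase_sum[t % period] / phase_cnt[t % period];
      resid[t]  = x[t] - trend[t] - season[t];
     }
  }
```
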
#MQL5 #MT5 #TimeSeries #AITrading

Read more...
πŸ‘35❀33πŸ‘¨β€πŸ’»2
The article delves into advancements in multivariate time series forecasting with a focus on the LSEAttention framework. This approach addresses numerical instability in traditional Transformers used in long-term forecasting tasks, which are prone to issues like attention and entropy collapse. The integration of Log-Sum-Exp (LSE) and GELU activation functions within this framework enhances numerical stability and mitigates abrupt transitions in attention scores, ensuring a balanced distribution of attention across input sequences. Furthermore, the implementation aspects of LSEAttention in MQL5, including modifications to Softmax layers and Relative Attention modules, underscore the practical enhancements made to bolster forecasting accuracy and efficiency.
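
The Log-Sum-Exp trick itself is compact: since log Σ exp(sᵢ) = m + log Σ exp(sᵢ − m) for m = max sᵢ, subtracting the row maximum before exponentiation leaves Softmax mathematically unchanged but prevents overflow. A minimal sketch:

```mql5
//--- numerically stable Softmax over one row of attention scores
void StableSoftmax(const double &scores[], double &probs[])
  {
   int n = ArraySize(scores);
   ArrayResize(probs, n);
   double mx = scores[0];
   for(int i = 1; i < n; i++)
      mx = MathMax(mx, scores[i]);
   double sum = 0.0;
   for(int i = 0; i < n; i++)
     {
      probs[i] = MathExp(scores[i] - mx);   // shift prevents overflow
      sum += probs[i];
     }
   for(int i = 0; i < n; i++)
      probs[i] /= sum;                      // normalise to a distribution
  }
```
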

👉 Read | Docs | @mql5dev

#MQL5 #MT5 #TimeSeries
❤27 👍2 👨‍💻2
Multivariate time series forecasting is a machine learning task focusing on predicting future trends from historical data. This task is difficult due to feature correlations and temporal dependencies and finds real-world applications in sectors like healthcare and finance. Transformer-based architectures, although impactful in NLP and computer vision, face challenges in time series forecasting due to training instability, especially with smaller datasets. The "SAMformer" framework addresses this with a simplified architecture, incorporating Sharpness-Aware Minimization and channel-wise attention to improve training stability and generalization. SAMformer optimizes Transformers to perform competitively by tackling entropy and loss sharpness issues, introducing novel strategies to enhance model efficiency and reliability.
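
The two-step SAM update can be sketched on a generic parameter vector; here a toy quadratic loss stands in for the model loss, and all names are illustrative:

```mql5
//--- toy loss L(w) = sum((w - target)^2)/2, so dL/dw = w - target
void Grad(const double &w[], const double &target[], double &g[])
  {
   int n = ArraySize(w);
   ArrayResize(g, n);
   for(int i = 0; i < n; i++)
      g[i] = w[i] - target[i];
  }

//--- one SAM step: ascend by rho along the normalised gradient, take the
//--- gradient at that "sharp" point, apply it at the original weights
void SAMStep(double &w[], const double &target[],
             const double rho, const double lr)
  {
   int n = ArraySize(w);
   double g[], w_adv[];
   Grad(w, target, g);                       // first gradient pass
   double norm = 0.0;
   for(int i = 0; i < n; i++)
      norm += g[i] * g[i];
   norm = MathSqrt(norm) + 1e-12;
   ArrayResize(w_adv, n);
   for(int i = 0; i < n; i++)                // perturb towards sharpness
      w_adv[i] = w[i] + rho * g[i] / norm;
   Grad(w_adv, target, g);                   // gradient at perturbed point
   for(int i = 0; i < n; i++)                // descend from original weights
      w[i] -= lr * g[i];
  }
```
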

👉 Read | Signals | @mql5dev

#MQL5 #MT5 #TimeSeries
❤60 👍13 👨‍💻10 👌5
The Multitask-Stockformer framework is detailed in a multi-part analysis of its theoretical and practical aspects, focusing on MQL5 implementation. It integrates discrete wavelet transformation for time series analysis with multitask self-attention models to capture complex financial data dependencies. The framework consists of three core modules: time series decomposition, a dual-frequency spatio-temporal encoder, and a dual-frequency fusion decoder. Each module enhances the analysis and prediction accuracy by focusing on different frequency components. The system is designed to handle diverse market conditions effectively, providing trend analysis, anomaly detection, and dynamic market adaptability. Implementation efforts continue with key system components optimized for time series analysis.
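
A one-level Haar transform illustrates the wavelet decomposition step (assuming an even input length): low-frequency approximation coefficients carry the trend, high-frequency details carry local shocks.

```mql5
//--- one-level Haar discrete wavelet transform
void HaarDWT(const double &x[], double &approx[], double &detail[])
  {
   int half = ArraySize(x) / 2;
   ArrayResize(approx, half);
   ArrayResize(detail, half);
   double s = MathSqrt(2.0);
   for(int i = 0; i < half; i++)
     {
      approx[i] = (x[2 * i] + x[2 * i + 1]) / s;  // low-frequency component
      detail[i] = (x[2 * i] - x[2 * i + 1]) / s;  // high-frequency component
     }
  }
```
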

👉 Read | Calendar | @mql5dev

#MQL5 #MT5 #TimeSeries
❤24 👍7 🤣3 👨‍💻3
The Hidformer framework leverages a unique dual-tower encoder structure to effectively analyze and forecast complex multivariate time series, particularly beneficial for handling dynamic and volatile data. This framework excels in drawing out both explicit and hidden dependencies in the data through advanced attention mechanisms, enhancing the analysis of both temporal structures and frequency domains.

A significant feature of Hidformer is its recursive attention mechanism, which aids in capturing intricate temporal dependencies in financial data. The linear attention mechanism complements this by optimizing computations while maintaining training stability. Together, these components enable reliable forecasts, vital in high-volatility markets.
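
The linear attention idea can be sketched for scalar queries, keys, and values: with a positive feature map (here a hypothetical two-dimensional φ(x) = (1, eˣ), assuming roughly normalized inputs), attention reduces to causal running sums and costs O(n) instead of the quadratic softmax form.

```mql5
//--- O(n) causal linear attention via a positive feature map; illustrative
void LinearAttention(const double &q[], const double &k[],
                     const double &v[], double &out[])
  {
   int n = ArraySize(q);
   ArrayResize(out, n);
   double kv0 = 0.0, kv1 = 0.0;        // running sums of phi(k)*v
   double k0  = 0.0, k1  = 0.0;        // running sums of phi(k)
   for(int t = 0; t < n; t++)
     {
      double fk1 = MathExp(k[t]);      // phi(x) = (1, exp(x)), assumption
      kv0 += v[t];                     // first feature is the constant 1
      kv1 += fk1 * v[t];
      k0  += 1.0;
      k1  += fk1;
      double fq1 = MathExp(q[t]);
      out[t] = (kv0 + fq1 * kv1) / (k0 + fq1 * k1);  // normalised output
     }
  }
```
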

The model's multilayer perceptron-based decoder offers efficient sequence prediction, improving long-t...

👉 Read | Signals | @mql5dev

#MQL5 #MT5 #TimeSeries
❤12