- October 11, 2018
- Posted by: Forex Wiki Team
- Category: Forex Trading Systems
※ Neural Network ─ Hull Moving Average (HMA) & Deviation-Scaled Moving Average (DSMA) ※
☛ Uses the HMA algorithm, but this version is a low-lag to zero-lag variation (a baseline HMA sketch follows this list)
and then... fused with the following:
↓
☛ Jurik filter/smoothing and adapted MA varieties
☛ Combined with the Deviation-Scaled Moving Average algorithm
☛ Higher & Best formula (Higher & APB calculation)
Note: Best used with the Volumes on Main Chart indicator ─ recommended for the indicator's trigger/update action
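The indicator's exact low-lag/zero-lag HMA variation is not published here, but the classic Hull Moving Average it starts from is public: HMA(n) = WMA(2·WMA(n/2) − WMA(n), √n). Below is a minimal Python sketch of that textbook baseline; the `wma`/`hma` helpers are illustrative and are not the indicator's source code.

```python
import numpy as np

def wma(values: np.ndarray, period: int) -> np.ndarray:
    """Linearly weighted moving average: the newest bar gets the largest weight."""
    weights = np.arange(1, period + 1, dtype=float)
    out = np.full(len(values), np.nan)
    for i in range(period - 1, len(values)):
        out[i] = values[i - period + 1 : i + 1] @ weights / weights.sum()
    return out

def hma(values: np.ndarray, period: int) -> np.ndarray:
    """Classic Hull Moving Average: WMA(2*WMA(n/2) - WMA(n), sqrt(n))."""
    raw = 2.0 * wma(values, max(period // 2, 1)) - wma(values, period)
    return wma(raw, max(int(np.sqrt(period)), 1))

# Example: smooth a random walk standing in for a price series.
prices = 100.0 + np.cumsum(np.random.default_rng(0).standard_normal(200))
print(hma(prices, 16)[-5:])
```

The nested-WMA construction is what gives the HMA its low lag: the `2*WMA(n/2) - WMA(n)` step over-weights recent data, and the final short WMA smooths the result.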
A brief theory of Neural Networks:
A neural network is an adjustable model of outputs as functions of inputs. It consists of several layers:
- an input layer, which consists of the input data
- one or more hidden layers, which consist of processing nodes called neurons
- an output layer, which consists of one or several neurons, whose outputs are the network outputs.
All nodes of adjacent layers are interconnected. These connections are called synapses. Every synapse has an assigned scaling coefficient, by which the data propagated through the synapse is multiplied. These scaling coefficients are called weights (w[i][j][k]). In a Feed-Forward Neural Network (FFNN) the data is propagated from the inputs to the outputs. Here is an example of an FFNN with one input layer, one output layer, and two hidden layers:
The topology of an FFNN is often abbreviated as follows: <# of inputs> - <# of neurons in the first hidden layer> - <# of neurons in the second hidden layer> - ... - <# of outputs>. The network above would be called a 4-3-3-1 network.
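The article's diagram is not reproduced here, but the 4-3-3-1 notation maps directly onto one weight matrix per synapse layer. A minimal illustrative sketch (the `layer_sizes` name and the random initialization are mine, not the article's):

```python
import numpy as np

rng = np.random.default_rng(0)

# 4-3-3-1: 4 inputs -> 3 hidden neurons -> 3 hidden neurons -> 1 output.
# Each synapse layer is one weight matrix; each entry scales the data
# passing through one synapse -- the w[i][j][k] weights mentioned above.
layer_sizes = [4, 3, 3, 1]
weights = [rng.standard_normal((n_in, n_out))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

print([w.shape for w in weights])  # [(4, 3), (3, 3), (3, 1)]
```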
The data is processed by the neurons in two steps, correspondingly shown inside the circle by a summation sign and a step sign:
- All inputs are multiplied by the associated weights and summed
- The resulting sums are processed by the neuron's activation function, whose output is the neuron's output.
It is the neuron's activation function that gives non-linearity to the neural network model. Without it, there is no reason to have hidden layers, and the neural network degenerates into a linear auto-regressive (AR) model.
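A sketch of those two steps, and of why the activation matters; `tanh` is my choice for illustration, since the article does not name the indicator's activation function:

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [4, 3, 3, 1]
weights = [rng.standard_normal((a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def forward(x, activation=np.tanh):
    a = x
    for w in weights[:-1]:
        a = activation(a @ w)  # step 1: weighted sum; step 2: activation
    return a @ weights[-1]     # linear output neuron

x = rng.standard_normal(4)
print(forward(x))                           # non-linear network output
print(forward(x, activation=lambda z: z))   # identity activation: the layers
# collapse into a single matrix product, i.e. a purely linear model
```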
✜ Deviation-Scaled Moving Average (DSMA) ✜
The new DSMA was created by John Ehlers and featured in the July 2018 issue of TASC magazine.
The DSMA is a data smoothing technique that acts as an exponential moving average with a dynamic smoothing coefficient. The smoothing coefficient is automatically updated based on the magnitude of price changes. In the Deviation-Scaled Moving Average, the standard deviation from the mean is chosen as the measure of this magnitude. The resulting indicator provides substantial smoothing of the data even when price changes are small, while quickly adapting to those changes.
The author explains that, thanks to its design, it has minimal lag yet delivers considerable smoothness. In this Neural Network - HMA & DSMA indicator, however, it is fused with Jurik filters/smoothing combined with the zero-lag HMA system.
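Ehlers' published DSMA first detrends price and runs it through a two-pole "super smoother" filter before scaling the EMA coefficient; the sketch below skips those stages and keeps only the core idea described above: an EMA whose coefficient grows with the deviation from the recent mean, measured in RMS units. The 5/period scaling mirrors Ehlers' article; everything else here is a simplification, not his published code.

```python
import numpy as np

def dsma_like(prices: np.ndarray, period: int = 40) -> np.ndarray:
    """Adaptive EMA in the spirit of the DSMA: quiet markets get a tiny
    smoothing coefficient (heavy smoothing); large deviations from the
    recent mean push the coefficient up (fast adaptation)."""
    out = np.empty(len(prices))
    out[0] = prices[0]
    for i in range(1, len(prices)):
        window = prices[max(0, i - period + 1) : i + 1]
        rms = max(np.sqrt(np.mean((window - window.mean()) ** 2)), 1e-12)
        # Deviation from the mean, in RMS units, mapped to an EMA coefficient.
        alpha = min(abs(prices[i] - window.mean()) / rms * 5.0 / period, 1.0)
        out[i] = alpha * prices[i] + (1.0 - alpha) * out[i - 1]
    return out
```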
☝ I cannot provide any kind of support such as coding (including the source code) or troubleshooting services.
☢ There is no guarantee that this indicator works perfectly or without errors. Therefore, use it at your own risk; I accept no legal liability for system damage, monetary loss, or even loss of life.
Last update:
8:00 AM
Thursday, October 11, 2018
Greenwich Mean Time (GMT)