
2019 VOL.2 Feb No.1

  • Title: A Traffic Flow Prediction Algorithm Using Deep Confidence Network Based on MapReduce Platform
  • Name: Guanyu Yang, Yuexin Li
  • Company: China University of Petroleum
  • Abstract:

    A traffic flow prediction method based on parallel deep learning is proposed. Built on the MapReduce parallel computing framework, the method combines a deep belief network (also called a deep confidence network, DBN) with a distributed in-memory computing structure, constructs a distributed memory environment, and establishes data sharding and multi-task scheduling mechanisms. The DBN is trained through multiple asynchronous parallel computations, and the dropout method is applied to prevent over-fitting. The improved parallel DBN learns traffic flow features, and a softmax regression model is attached to the top of the network for traffic prediction. Experimental results on real traffic flow data show that the improved parallel DBN outperforms traditional prediction models in both prediction accuracy and time complexity. A minimal single-machine sketch of the dropout-regularized DBN with a softmax output is given after the citation details below.

  • Keyword: Deep Learning; Traffic Flow Prediction; Restricted Boltzmann Machine Model; Distributed Memory Computing; Deep Confidence Network
  • DOI: 10.12250/jpciams2019010127
  • Citation form: Guanyu Yang, Yuexin Li. A Traffic Flow Prediction Algorithm Using Deep Confidence Network Based on MapReduce Platform[J]. Computer Informatization and Mechanical System, 2019, vol. 2, pp. 7-13.
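
The paper's MapReduce-based asynchronous training pipeline is not reproduced here; the following is a minimal single-machine sketch, in Python with NumPy, of the core model the abstract describes: greedy layer-wise pretraining of stacked restricted Boltzmann machines (a deep belief network), dropout applied to the learned features, and a softmax regression layer on top for prediction. The layer sizes, learning rates, number of output classes, and synthetic data are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): DBN = stacked RBMs pretrained with
# CD-1, dropout on the top-level features, softmax regression for prediction.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Restricted Boltzmann machine trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def train_step(self, v0):
        h0 = self.hidden_probs(v0)                        # positive phase
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = sigmoid(h0_sample @ self.W.T + self.b_v)     # one Gibbs step
        h1 = self.hidden_probs(v1)                        # negative phase
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic stand-in for normalised traffic-flow feature windows (assumption).
X = rng.random((512, 64))              # 512 samples, 64 flow readings each
y = rng.integers(0, 3, size=512)       # 3 illustrative traffic-state classes
Y = np.eye(3)[y]                       # one-hot targets

# Greedy layer-wise pretraining of the DBN (two RBM layers here).
layers = [RBM(64, 32), RBM(32, 16)]
h = X
for rbm in layers:
    for _ in range(20):
        rbm.train_step(h)
    h = rbm.hidden_probs(h)            # features passed to the next layer

# Softmax regression on top of the DBN features, with dropout (keep prob 0.5)
# applied to the features during training to reduce over-fitting.
W_out = 0.01 * rng.standard_normal((16, 3))
b_out = np.zeros(3)
lr, p_keep = 0.1, 0.5
for _ in range(200):
    mask = (rng.random(h.shape) < p_keep) / p_keep        # inverted dropout
    h_drop = h * mask
    probs = softmax(h_drop @ W_out + b_out)
    grad = (probs - Y) / len(h)                           # cross-entropy gradient
    W_out -= lr * h_drop.T @ grad
    b_out -= lr * grad.sum(axis=0)

pred = softmax(h @ W_out + b_out).argmax(axis=1)          # no dropout at test time
print("training accuracy:", (pred == y).mean())
```

In the paper's setting, the pretraining and fine-tuning steps would be distributed as Map tasks over data shards and combined asynchronously across workers; the sketch keeps everything on one machine for clarity.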
References:

[1] Hu W, Yan L, Liu K, et al. A Short-term Traffic Flow Forecasting Method Based on the Hybrid PSO-SVR[J]. Neural Processing Letters, 2016, 43(1):155-172.

[2] Sun S, Zhang C, Zhang Y. Traffic Flow Forecasting Using a Spatio-temporal Bayesian Network Predictor[J]. Lecture Notes in Computer Science, 2017, 5(9):273-278.

[3] Hong W C, Dong Y, Zheng F, et al. Hybrid evolutionary algorithms in a SVR traffic flow forecasting model[J]. Applied Mathematics & Computation, 2011, 217(15):6733-6747.

[4] Deng L, Yu D. Deep Learning: Methods and Applications[J]. Foundations & Trends in Signal Processing, 2014, 7(3):197-387.

[5] Dan C C, Meier U, Gambardella L M, et al. Deep Big Simple Neural Nets Excel on Handwritten Digit Recognition[J]. Neural Computation, 2010, 22(12):3207-3220.

[6] Le Q V. Building high-level features using large scale unsupervised learning[C]// IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2013:8595-8598.

[7] Martens J. Deep learning via Hessian-free optimization[C]// International Conference on International Conference on Machine Learning. Omnipress, 2010:735-742.

[8] Lv Y, Duan Y, Kang W, et al. Traffic Flow Prediction With Big Data: A Deep Learning Approach[J]. IEEE Transactions on Intelligent Transportation Systems, 2015, 16(2):865-873.

[9] Huang W, Song G, Hong H, et al. Deep Architecture for Traffic Flow Prediction: Deep Belief Networks With Multitask Learning[J]. IEEE Transactions on Intelligent Transportation Systems, 2014, 15(5):2191-2201.

[10] Hinton G E, Osindero S, Teh Y W. A fast learning algorithm for deep belief nets[J]. Neural Computation, 2006, 18(7):1527-1554.

[11] Fischer A. Training Restricted Boltzmann Machines[J]. KI - Künstliche Intelligenz, 2015, 29(4):441-444.

[12] Hinton G E. A practical guide to training restricted Boltzmann machines[J]. Momentum, 2012, 9(1):599-619.

[13] Dean J, Ghemawat S. MapReduce: A Flexible Data Processing Tool[J]. Communications of the ACM, 2010, 53(1):72-77.

[14] Zaharia M, Chowdhury M, Das T, et al. Resilient distributed datasets: a fault-tolerant abstraction for in-memory cluster computing[C]// Usenix Conference on Networked Systems Design and Implementation. USENIX Association, 2012:2-2.

[15] Chandrasekar S, Dakshinamurthy R, Seshakumar P G, et al. A novel indexing scheme for efficient handling of small files in Hadoop Distributed File System[C]// International Conference on Computer Communication and Informatics. IEEE, 2013:1-8.

[16] Duchi J, Hazan E, Singer Y. Adaptive Subgradient Methods for Online Learning and Stochastic Optimization[J]. Journal of Machine Learning Research, 2011, 12(7):257-269.

[17] Park S W, Park J, Bong K, et al. An Energy-Efficient and Scalable Deep Learning/Inference Processor With Tetra-Parallel MIMD Architecture for Big Data Applications[J]. IEEE Transactions on Biomedical Circuits & Systems, 2016, 9(6):838-848.

[18] Du T, Li L. Deep Neural Networks with Parallel Autoencoders for Learning Pairwise Relations: Handwritten Digits Subtraction[C]// IEEE International Conference on Machine Learning & Applications. 2016.

[19] Mao B, Fadlullah Z M, Tang F, et al. Routing or Computing? The Paradigm Shift Towards Intelligent Computer Network Packet Transmission Based on Deep Learning[J]. IEEE Transactions on Computers, 2017, 66(11):1946-1960.

[20] Dao M S, Mezaris V, et al. Deep Learning for Mobile Multimedia: A Survey[J]. ACM Transactions on Multimedia Computing, Communications, and Applications, 2017, 13(3s):34-42.


 

