Improved Optimization and Speed up in Big Stream Data Processing


Vivek Kumar, et al.

Abstract

To provide low latency, higher throughput, and speedup in stream processing, there should be a systematic flow design that accepts the continuous incoming stream and feeds it to different operators working in parallel. In this paper, we implement the proposed pipeline and watermark scheme on Apache Beam, with Google Cloud Dataflow as the runner. The experiments were carried out on a stock market dataset, considering the prices of oil, the US dollar, and gold as essential dependent parameters. The experimental results showed that a relationship exists between the stock price and these dependent parameters. Since stock market prediction requires these dependent parameters to be present, and they originate in a distributed environment, any delay in a parameter affects the prediction result; good optimization and low latency are therefore essential. To implement effective stream processing, we use the pipeline and watermark concepts to handle and reduce such delays and to increase the speedup of big data stream processing.
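The abstract's core idea, using a watermark so that windows of a stream are only finalized once delayed inputs have had a chance to arrive, can be illustrated with a minimal pure-Python sketch. This is not the paper's Apache Beam implementation; the fixed window size, allowed lateness, and the `process` function are illustrative assumptions.

```python
from collections import defaultdict

WINDOW_SIZE = 10      # seconds per fixed window (illustrative)
ALLOWED_LATENESS = 5  # watermark lags the max seen event time by this much

def window_start(ts):
    """Map an event timestamp to the start of its fixed window."""
    return (ts // WINDOW_SIZE) * WINDOW_SIZE

def process(events):
    """Assign out-of-order (timestamp, price) events to fixed windows,
    emitting a window's average price once the watermark passes its end.
    Events older than the current watermark are dropped as too late."""
    windows = defaultdict(list)   # window start -> list of prices
    emitted = {}                  # window start -> average price
    watermark = float("-inf")
    for ts, price in events:
        if ts < watermark:
            continue  # arrived after the watermark passed: drop it
        windows[window_start(ts)].append(price)
        watermark = max(watermark, ts - ALLOWED_LATENESS)
        # finalize every window whose end the watermark has now passed
        for ws in [w for w in windows if w + WINDOW_SIZE <= watermark]:
            prices = windows.pop(ws)
            emitted[ws] = sum(prices) / len(prices)
    # flush windows still open at end of stream
    for ws, prices in windows.items():
        emitted[ws] = sum(prices) / len(prices)
    return emitted
```

For example, `process([(1, 100.0), (12, 102.0), (3, 101.0), (25, 104.0)])` drops the event at time 3 (it arrives after the watermark has advanced past it) and emits one average per window. In Apache Beam the same behavior would come from fixed windowing with an allowed-lateness setting rather than hand-rolled bookkeeping.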


Article Details

How to Cite
Kumar, V., et al. (2021). Improved Optimization and Speed up in Big Stream Data Processing. Turkish Journal of Computer and Mathematics Education (TURCOMAT), 12(12), 3301–3305. https://doi.org/10.17762/turcomat.v12i12.8012