Index: Introduction; Cloud Computing; Characteristics of Cloud Computing; Relationship between Cloud Computing and Big Data; Literature Review; Conclusion

Interconnection through information technology, in its many forms, creates large amounts of data, and such data requires processing and storage. The cloud is an online storage model in which data is stored on many virtual servers. The spread of Big Data poses a new challenge to the IT field, particularly to cloud computing. Data processing involves acquiring, storing and analyzing data, which raises a central question: what is the relationship between big data and cloud computing? This paper addresses that question by studying big data and cloud computing, examining the relationship between them in terms of security and challenges, and reviewing the literature on big data for cloud computing.

Keywords: Big Data, Analytics, Big Data Vs, Cloud Computing

Introduction

The term Big Data refers to volumes of data that are not easy to process with the usual data management tools or traditional processing applications. The data comes from everywhere: sensors used to gather climate information, posts on social media sites, digital images and videos, purchase transaction records, and cell phone GPS signals. We live in a world where data grows rapidly because of the Internet, sensors and the increasing use of hand-held and connected devices. According to Gartner, data volumes grow at a rate of 59% each year. This growth can be characterized in terms of four Vs:

Volume: the large amount of data generated by organizations or individuals. Today the volume of data in almost every organization is approaching exabytes. According to IBM, there are more than 2.7 zettabytes of data in the digital world today, and more than 571 new websites are created every minute.

Velocity: the pace at which data is generated, captured and shared. Enterprises can only benefit from data if it is captured and shared in real time.

Variety: data comes from dissimilar source types, such as internal, external, social and behavioral sources, and in different formats, such as images, text, video, audio, etc.

Veracity: refers to the uncertainty of the data, i.e. whether the data obtained is accurate and reliable. Big data, above all its unstructured and semi-structured parts, tends to be messy, and it takes a good deal of effort and skill to clean it up and make it suitable for analysis.

The type and nature of the data also matter: big data comes from different sources, including sensors and free text from social media, as well as unstructured data, metadata and other spatial data collected from web logs, GPS, medical devices, etc. Based on these practical aspects, big data is commonly divided into a number of categories (see the short sketch after this list):

Structured data: data organized in the form of tables or databases so that it can be managed directly.

Unstructured data: the largest part of the data; the data that people produce on a daily basis, such as text, images, videos, messages, log records, click-streams, etc.

Semi-structured (or multi-structured) data: data that has some structure but is not laid out in tables or databases, for example XML or JSON documents.
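To make the three categories concrete, the following short Python sketch contrasts a structured CSV row, a semi-structured JSON document, and unstructured free text. It is illustrative only; the record layout, field names and values are invented for the example and do not come from this article.

```python
import csv
import io
import json

# Structured data: fixed schema, rows and columns, fits a relational table.
structured_csv = "id,name,signup_date\n1,Alice,2021-04-01\n"
rows = list(csv.DictReader(io.StringIO(structured_csv)))

# Semi-structured data: self-describing JSON with nested and optional fields,
# structured but not laid out as a fixed table (XML would be analogous).
semi_structured = json.loads(
    '{"id": 1, "name": "Alice", "devices": [{"type": "phone", "gps": true}]}'
)

# Unstructured data: free text whose structure must be inferred, e.g. by text mining.
unstructured = "Alice posted a photo and checked in near the airport."

print(rows[0]["name"])                        # 'Alice' from the structured row
print(semi_structured["devices"][0]["type"])  # 'phone' from the nested JSON
print(len(unstructured.split()), "tokens")    # token count of the free text
```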
Cloud Computing

Cloud computing is a rapidly growing technology that has established itself in the next generation of the IT industry and business. It promises reliable software, hardware and Infrastructure-as-a-Service (IaaS) delivered over the Internet and remote data centers. Cloud services have become a powerful framework for performing large-scale, complex computing tasks and span a range of IT functions, from storage and computation to database and application services. The need to store, process and analyze large volumes of data has pushed many organizations and individuals to adopt cloud computing. Many scientific applications for large-scale experiments are currently deployed in the cloud, and their number may continue to grow, because cloud resources can be provisioned and released rapidly with minimal management effort or interaction with the service provider. Cloud computing has several favorable aspects for coping with rapid data growth and with economic and technological constraints: it reduces the total cost of ownership and allows organizations to focus on their core business without worrying about issues such as infrastructure, elasticity and availability of resources. Furthermore, the combination of the utility model of cloud computing with a rich set of computation, infrastructure and storage services offers a very attractive environment in which researchers can carry out their experiments.

Cloud service models typically comprise PaaS, SaaS and IaaS. PaaS offerings, such as Google App Engine, Salesforce.com, the Force.com platform and Microsoft Azure, expose resources running in the cloud as computing platforms for end users. SaaS offerings, such as Google Docs, Gmail, Salesforce.com and Online Payroll, are applications running on remote cloud infrastructure operated by the cloud service provider and accessed as services over the Internet. IaaS offerings, such as FlexiScale and Amazon EC2, provide hardware resources in the cloud that are supplied by service providers and used by end users on demand. The growing popularity of wireless networks and mobile devices has taken cloud computing to new heights because of the limited processing power, storage capacity and battery life of each device.

Characteristics of Cloud Computing

Cloud computing is a distributed paradigm with a well-defined model. NIST identified the main characteristics of the cloud, summarizing the idea of cloud computing in five characteristics (an illustrative provisioning sketch follows the list):

On-demand self-service: cloud services provide computing resources such as storage and processing as needed and without any human intervention.

Broad network access: cloud computing resources are available over the network; mobile and smart devices and even sensors can access computing resources in the cloud.

Resource pooling: cloud platform users share a huge pool of computing resources; users may specify the type of resources and a preferred geographic region, but cannot determine the exact physical location of those resources.

Rapid elasticity: storage, network, compute and application resources are continuously available and can be scaled up or down almost instantly, which allows capacity to follow demand and ensures optimal resource utilization.

Measured service: cloud systems can meter operations and resource consumption, and provide monitoring, control and reporting in a fully transparent way.
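As a concrete illustration of on-demand self-service and rapid elasticity at the IaaS layer, the hedged sketch below provisions and then releases a single Amazon EC2 instance (EC2 is mentioned above) using the boto3 SDK. The region, AMI ID and instance type are placeholder values chosen for the example, and AWS credentials are assumed to be configured; none of these specifics come from this article.

```python
# Illustrative sketch of on-demand IaaS provisioning with Amazon EC2 via boto3.
# Assumes AWS credentials are already configured; the AMI ID, region and
# instance type below are placeholders, not recommendations from this article.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# On-demand self-service: request a virtual machine without human intervention.
response = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Provisioned:", instance_id)

# Rapid elasticity: release the resource as soon as it is no longer needed,
# so that usage (and billing, under measured service) tracks actual demand.
ec2.terminate_instances(InstanceIds=[instance_id])
```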
Relationship between Cloud Computing and Big Data

Cloud computing and big data are closely linked. Big data gives users the ability to use commodity computing to process distributed queries across multiple datasets and return result sets in a timely manner, while cloud computing provides the underlying engine, often through Hadoop, a class of distributed data processing platforms. Big data sources from the cloud and from the Web are stored in a fault-tolerant distributed database and processed through a programming model for large datasets using a parallel distributed algorithm on a cluster. The main purpose of data visualization, as shown in Fig. 2, is to present analytical results visually, through different kinds of graphs, to support decision making.

Big data relies on distributed, cloud-based storage rather than on local storage attached to a single computer or device. Big data analysis is driven by rapidly growing cloud-based applications developed using different cloud technologies, and it must cope with this new environment because managing big data for concurrent processing has become increasingly complicated. MapReduce is a good example of big data processing in a cloud environment: it enables the processing of large amounts of data stored in parallel across a cluster (a minimal sketch of the model is given at the end of this section). Cluster computing shows good performance in distributed system environments in terms of compute power, storage and network bandwidth. Likewise, Boldly and Firestone highlighted the ability of cluster computing to provide an environment favorable to data growth. However, Miller argued that a lack of data availability is costly: as users offload more decisions to analytical methods, incorrect use of those methods, or inherent weaknesses in virtualized technologies, becomes damaging. Therefore, cloud computing not only offers facilities for the aggregation and distribution of big data but also serves as a service model for it. For cloud-based big data analytics there are several frameworks, such as Google MapReduce, Spark, Twister, Hadoop MapReduce and Hadoop++; these frameworks are intended for storing and processing data. To store this data there are database systems such as HBase, Bigtable and HadoopDB.
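The sketch below is a minimal, single-process illustration of the MapReduce model referenced above, using a word count over two tiny in-memory documents. It only shows the shape of the map, shuffle and reduce phases; a real deployment (Hadoop MapReduce, Spark) distributes these phases across the nodes of a cluster, and the sample documents are invented for the example.

```python
# Minimal single-process illustration of the MapReduce model (word count).
# Real frameworks distribute these phases across a cluster; this sketch only
# demonstrates the shape of the computation.
from collections import defaultdict
from itertools import chain

documents = [
    "big data needs cloud storage",
    "cloud computing stores big data",
]

# Map phase: each document is turned into (key, value) pairs independently,
# so this step can run in parallel on different cluster nodes.
def map_doc(doc):
    return [(word, 1) for word in doc.split()]

mapped = list(chain.from_iterable(map_doc(d) for d in documents))

# Shuffle phase: group all values belonging to the same key together.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: aggregate each group; reducers can also run in parallel.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)  # e.g. {'big': 2, 'data': 2, 'cloud': 2, ...}
```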
Literature Review

Saeed Ullah, M. Daud Awan and Sikander Hayat Khayal identify key features that characterize big data frameworks, together with the challenges and problems associated with them. They use evaluation metrics covering different aspects to identify usage scenarios for these platforms, review several big data resource management frameworks and study the advantages and disadvantages of each of them. They also carry out a performance evaluation of resource management engines based on seven key factors, ranking each framework on empirical evidence.

Blesson Varghese and Rajkumar Buyya first discuss the evolution of cloud infrastructure and consider the use of multi-cloud infrastructure and the benefits of decentralizing computing away from data centers. These trends create the need for a variety of new computing architectures that will be offered by future cloud infrastructure; such architectures are expected to affect areas such as connecting people and devices, data-intensive computing, the service space and self-learning systems. The authors lay out a roadmap of challenges that will need to be addressed to realize the potential of next-generation cloud systems.

Qusay Kanaan Kadhim and Robiah Yusof et al. aim to review and classify the issues surrounding the implementation of cloud computing, a hot area that needs to be addressed by future research. They note that the security problem has become even more complex under the cloud model, as new dimensions have entered the scope of the problem, related to data security, network security, users' privacy, and platform and infrastructure issues. The study was designed to highlight the security issues of cloud computing, and its findings point to five main issues associated with the implementation of cloud computing: cloud governance and application mobility, security of cloud services and applications, cloud data security, cloud network security, and cloud platform and infrastructure security. These issues leave open space for future research to bridge the security gap by providing technical approaches or empirical models that mitigate them.

Constandinos X. Mavromoustakis, Georgios Skourletopoulos et al. present a review of current big data research, exploring applications, opportunities and challenges, as well as the state-of-the-art techniques and underlying models that exploit cloud computing capabilities, such as Big Data-as-a-Service (BDaaS) and Analytics-as-a-Service (AaaS). The authors suggest that a cost-benefit analysis is also needed to measure the long-term benefits of adopting big-data-as-a-service business models, to support data-driven decision making and to communicate the results to non-technical stakeholders.

Samir A. El-Seoud, Hosam F. El-Sofany and Mohamed Abdelfattah introduce the characteristics, trends and challenges of big data and investigate the benefits and risks that can arise from the integration of big data and cloud computing. They suggest that the main advantage of this integration is data storage and the availability of processing power: the cloud has access to a large pool of resources and various forms of infrastructure that can accommodate the integration in the best possible way, and with minimal effort the environment can be configured and managed to serve all big data requirements, such as data analytics. This in turn yields high efficiency with few complications. The authors conclude that, given today's state of knowledge and development in the field, the cloud is the most practical solution for hosting and processing big data environments.

Nabeel Zanoon, Abdullah Al Haj and Sufian M. Khwaldeh propose a definition of big data and a model that illustrates the relationship between big data and cloud computing. Having studied both in several important respects, they conclude that the relationship between them is complementary: big data and cloud computing form an integrated model in the world of distributed network technology. The growth of big data and its needs is a factor that motivates cloud service providers to keep developing, because the relationship between them is based jointly on product, storage and distribution: big data represents the product and the cloud represents the container. Both are advancing rapidly to keep pace with progress in technology and with user requirements.

Samiya Khan, Kashish A.
Shakil and Mansaf Alam discuss the data analysis methods and technologies that can be used for big data, and survey the existing popular approaches, frameworks and platforms for different big data applications. They also evaluate the feasibility of cloud-based big data computing and carefully examine the existing challenges and opportunities. Big data informs and affects every aspect of people's lives, and there is hardly any technology-enabled domain that cannot make use of big-data-based solutions to reach more precise business conclusions. However, to make this technology commercially viable, research teams must identify potential "big" datasets and possible analytical applications for the field in question. That said, the feasibility and commercial viability of such analytical applications must be aligned with business and customer requirements.

Xiaoxia Wang and Zhanqiang Li present a big data roadmap that relies on cloud computing to make urban traffic and transportation smarter through pattern extraction and visualization. Visualized data quickly reveals relationships and highlights new and unexpected patterns in the most relevant current evidence. The cloud computing platform smoothly transforms traditional government services, helps the state align infrastructure innovation with its management strategy, and creates intelligent administrative networks that encourage effective collaboration.

Izang A. A. and Kuyoro S. O. et al. argue that cloud computing helps to address the problem of data storage and structure. After examining some of the issues related to big data and cloud computing, they recommend specific solutions for improving the two core concepts, which would drive broader development and increase the rate of adoption of cloud computing by organizations. It is important that organizations consider how their data will be produced in the future before deploying any cloud services in their business. Given the trend of steadily increasing data, which is expected to double on an annual basis, the authors suggest that research should continue in these two areas to see how the two key concepts can be improved and how their problems and challenges can be reduced to a minimum.

Pedro Caldeira Neves, Bradley Schmerl and Jorge Bernardino note that the cloud environment strongly supports big data solutions by providing a fault-tolerant, scalable and available environment for big data systems. Although big data platforms are powerful tools for extracting value from data, some concerns need further study and additional work is still required. The authors recommend, through suitable adaptive mechanisms, developing a solution for applying elasticity to the different dimensions of big data systems running in cloud environments; the goal is to examine mechanisms that adapt the use of software containers to enable scalability at different levels of the cloud stack.

Chaowei Yang and Qunying Huang et al. present future innovations and a research agenda for cloud computing that supports transforming the volume, velocity, variety and veracity of big data into value for local to global Earth sciences and digital applications.
In this research, the proposed initiatives address the ten aspects, identified in the McKinsey report, that are expected to produce the next generation of valuable technology-based companies.

Ibrahim Abaker Targio Hashem and Ibrar Yaqoob et al. note that the use of cloud services for data collection, processing and analysis has been around for some time; it has changed the landscape of information technology and turned the long-standing promise of on-demand computing into reality. In their study, the authors analyze the rise of big data in cloud computing and present a classification of big data together with a conceptual view of big data.