Which of the following are usually good data sources? Select all that apply.

Data analytics (DA) is the process of examining data sets in order to find trends and draw conclusions about the information they contain. Increasingly, data analytics is done with the aid of specialized systems and software. Data analytics technologies and techniques are widely used in commercial industries to enable organizations to make more-informed business decisions. Scientists and researchers also use analytics tools to verify or disprove scientific models, theories and hypotheses.

As a term, data analytics predominantly refers to an assortment of applications, from basic business intelligence (BI), reporting and online analytical processing (OLAP) to various forms of advanced analytics. In that sense, it's similar in nature to business analytics, another umbrella term for approaches to analyzing data. The difference is that the latter is oriented to business uses, while data analytics has a broader focus. The expansive view of the term isn't universal, though: In some cases, people use data analytics specifically to mean advanced analytics, treating BI as a separate category.

Data analytics initiatives can help businesses increase revenue, improve operational efficiency, optimize marketing campaigns and bolster customer service efforts. Analytics also enable organizations to respond quickly to emerging market trends and gain a competitive edge over business rivals. The ultimate goal of data analytics, however, is boosting business performance. Depending on the particular application, the data that's analyzed can consist of either historical records or new information that has been processed for real-time analytics. In addition, it can come from a mix of internal systems and external data sources.

Types of data analytics applications

At a high level, data analytics methodologies include exploratory data analysis (EDA) and confirmatory data analysis (CDA). EDA aims to find patterns and relationships in data, while CDA applies statistical techniques to determine whether hypotheses about a data set are true or false. EDA is often compared to detective work, while CDA is akin to the work of a judge or jury during a court trial -- a distinction first drawn by statistician John W. Tukey in his 1977 book Exploratory Data Analysis.
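The two approaches can be sketched in a few lines of Python: summary statistics for the exploratory step, and a Welch t statistic for the confirmatory step. The sales figures for the two hypothetical store layouts are invented purely for illustration.

```python
import statistics

# Hypothetical daily sales for two store layouts (illustrative data only).
layout_a = [210, 225, 198, 240, 232, 219, 227]
layout_b = [241, 255, 239, 260, 248, 252, 244]

def describe(sample):
    """EDA: summarise a sample to look for a pattern worth investigating."""
    return {
        "mean": statistics.mean(sample),
        "stdev": statistics.stdev(sample),
        "min": min(sample),
        "max": max(sample),
    }

def welch_t(x, y):
    """CDA: Welch's t statistic tests whether the difference in means
    is larger than sampling noise alone would explain."""
    vx, vy = statistics.variance(x), statistics.variance(y)
    nx, ny = len(x), len(y)
    return (statistics.mean(x) - statistics.mean(y)) / ((vx / nx + vy / ny) ** 0.5)

print(describe(layout_a))
print(describe(layout_b))
# A t statistic well above 2 suggests the gap between layouts is real.
print(welch_t(layout_b, layout_a))
```

In practice the confirmatory step would use a full hypothesis test with a p-value (e.g. `scipy.stats.ttest_ind`), but the division of labor is the same: EDA surfaces the pattern, CDA judges it.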

Data analytics can also be separated into quantitative data analysis and qualitative data analysis. The former involves the analysis of numerical data with quantifiable variables. These variables can be compared or measured statistically. The qualitative approach is more interpretive -- it focuses on understanding the content of non-numerical data like text, images, audio and video, as well as common phrases, themes and points of view.

At the application level, BI and reporting provide business executives and corporate workers with actionable information about key performance indicators, business operations, customers and more. In the past, data queries and reports typically were created for end users by BI developers who worked in IT. Now, more organizations use self-service BI tools that let executives, business analysts and operational workers run their own ad hoc queries and build reports themselves.

Advanced types of data analytics include data mining, which involves sorting through large data sets to identify trends, patterns and relationships. Another is predictive analytics, which seeks to predict customer behavior, equipment failures and other future business scenarios and events. Machine learning can also be used for data analytics, by running automated algorithms to churn through data sets more quickly than data scientists can do via conventional analytical modeling. Big data analytics applies data mining, predictive analytics and machine learning tools to data sets that can include a mix of structured, unstructured and semistructured data. Text mining provides a means of analyzing documents, emails and other text-based content.
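As a minimal sketch of the predictive idea, the toy classifier below predicts equipment failure from past sensor readings using the single nearest historical example; the readings and labels are invented for illustration, and real predictive analytics would use far larger data sets and richer models.

```python
def predict_1nn(history, reading):
    """Return the label of the historical reading closest to `reading`
    (1-nearest-neighbour classification)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    _, label = min(history, key=lambda item: dist(item[0], reading))
    return label

# (temperature °C, vibration mm/s) -> did the machine fail within a week?
history = [
    ((62.0, 1.1), "ok"),
    ((64.5, 1.3), "ok"),
    ((71.0, 2.8), "failed"),
    ((73.5, 3.1), "failed"),
]

print(predict_1nn(history, (63.0, 1.2)))   # -> "ok": closest to healthy readings
print(predict_1nn(history, (72.0, 3.0)))   # -> "failed": closest to pre-failure readings
```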

Data analytics initiatives support a wide variety of business uses. For example, banks and credit card companies analyze withdrawal and spending patterns to prevent fraud and identity theft. E-commerce companies and marketing services providers use clickstream analysis to identify website visitors who are likely to buy a particular product or service -- based on navigation and page-viewing patterns. Healthcare organizations mine patient data to evaluate the effectiveness of treatments for cancer and other diseases.

Mobile network operators examine customer data to forecast churn; that enables them to take steps to prevent customers from defecting to rival vendors. To boost customer relationship management efforts, companies engage in CRM analytics to segment customers for marketing campaigns and equip call center workers with up-to-date information about callers.

Inside the data analytics process

Data analytics applications involve more than just analyzing data, particularly on advanced analytics projects. Much of the required work takes place upfront, in collecting, integrating and preparing data and then developing, testing and revising analytical models to ensure that they produce accurate results. In addition to data scientists and other data analysts, analytics teams often include data engineers, who create data pipelines and help prepare data sets for analysis.

The analytics process starts with data collection. Data scientists identify the information they need for a particular analytics application, and then work on their own or with data engineers and the IT staff to assemble it for use. Data from different source systems may need to be combined via data integration routines, transformed into a common format and loaded into an analytics system, such as a Hadoop cluster, NoSQL database or data warehouse.
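A data integration routine of the kind described above can be sketched in a few lines: records from two hypothetical source systems arrive with different field names and formats, and each is mapped into one common schema before loading. The systems and field names are assumptions for illustration.

```python
# Records as they might arrive from two different source systems.
crm_rows = [{"cust_id": "C1", "full_name": "Ada Lovelace", "spend_usd": "120.50"}]
web_rows = [{"customer": "C2", "name": "Alan Turing", "total_spend": 88.0}]

def from_crm(row):
    """Map a CRM record into the common schema, parsing spend to a float."""
    return {"id": row["cust_id"], "name": row["full_name"], "spend": float(row["spend_usd"])}

def from_web(row):
    """Map a web-analytics record into the same common schema."""
    return {"id": row["customer"], "name": row["name"], "spend": float(row["total_spend"])}

# The combined, uniformly-shaped data set, ready to load into an analytics system.
combined = [from_crm(r) for r in crm_rows] + [from_web(r) for r in web_rows]
print(combined)
```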

In other cases, the collection process may consist of pulling a relevant subset out of a stream of data that flows into, for example, Hadoop. The data is then moved to a separate partition in the system so it can be analyzed without affecting the overall data set.

Once the data that's needed is in place, the next step is to find and fix data quality problems that could affect the accuracy of analytics applications. That includes running data profiling and data cleansing tasks to ensure the information in a data set is consistent and that errors and duplicate entries are eliminated. Additional data preparation work is done to manipulate and organize the data for the planned analytics use. Data governance policies are then applied to ensure that the data follows corporate standards and is being used properly.
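A minimal sketch of that profiling and cleansing step: count missing values per field, drop exact duplicate records, and standardise inconsistent text. The records are invented for illustration.

```python
records = [
    {"id": 1, "country": "USA", "revenue": 100.0},
    {"id": 1, "country": "USA", "revenue": 100.0},   # duplicate entry
    {"id": 2, "country": "usa", "revenue": None},    # inconsistent case, missing value
    {"id": 3, "country": "Canada", "revenue": 55.0},
]

# Profiling: how many records are missing each field?
missing = {
    field: sum(1 for r in records if r[field] is None)
    for field in records[0]
}

# Cleansing: drop exact duplicates and standardise country names to upper case.
seen, clean = set(), []
for r in records:
    key = tuple(sorted(r.items()))
    if key in seen:
        continue
    seen.add(key)
    clean.append({**r, "country": r["country"].upper() if r["country"] else None})

print(missing)
print(clean)
```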

From here, a data scientist builds an analytical model, using predictive modeling tools or other analytics software and programming languages such as Python, Scala, R and SQL. Typically, the model is initially run against a partial data set to test its accuracy; it's then revised and tested again as needed. This process is known as "training" the model until it functions as intended. Finally, the model is run in production mode against the full data set, something that can be done once to address a specific information need or on an ongoing basis as the data is updated.
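The train-and-revise cycle can be sketched with a deliberately simple model: fit a single threshold on part of the data, check its accuracy on the held-out rows, then apply it to the full set. The data (account balance vs. whether the customer churned) and the 50/50 split are assumptions for illustration.

```python
# (feature value, label) pairs: balance vs. churned (1) or retained (0).
data = [(10, 1), (12, 1), (15, 1), (30, 0), (35, 0), (40, 0), (11, 1), (33, 0)]

# "Training": fit the model on a partial data set, hold the rest out for testing.
train, test = data[: len(data) // 2], data[len(data) // 2:]

def fit_threshold(rows):
    """Pick the cut-off that best separates the two labels in `rows`.
    The model predicts churn (1) whenever the feature is below the threshold."""
    def accuracy(t):
        return sum((x < t) == bool(y) for x, y in rows) / len(rows)
    return max(sorted(x for x, _ in rows), key=accuracy)

threshold = fit_threshold(train)

# Test against the held-out rows; in practice the model would be revised
# and retested here until its accuracy is acceptable.
holdout_accuracy = sum((x < threshold) == bool(y) for x, y in test) / len(test)

# "Production": run the trained model against the full data set.
predictions = [int(x < threshold) for x, _ in data]
print(threshold, holdout_accuracy, predictions)
```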

In some cases, analytics applications can be set to automatically trigger business actions. An example is stock trades by a financial services firm. Otherwise, the last step in the data analytics process is communicating the results generated by analytical models to business executives and other end users. Charts and other infographics can be designed to make findings easier to understand. Data visualizations often are incorporated into BI dashboard applications that display data on a single screen and can be updated in real time as new information becomes available.

Data analytics vs. data science

As automation grows, data scientists will focus more on business needs, strategic oversight and deep learning. Data analysts who work in business intelligence will focus more on model creation and other routine tasks. In general, data scientists concentrate efforts on producing broad insights, while data analysts focus on answering specific questions. In terms of technical skills, future data scientists will need to focus more on the machine learning operations process, also called MLOps.

This was last updated in September 2020

Which of the following are usually good data sources? Select all that apply.

Vetted public datasets, academic papers, and governmental agency data are usually good data sources.

Which of the following principles are key elements of data integrity? Select all that apply.

According to the ALCOA principle, the data should have the following five qualities to maintain data integrity: Attributable, Legible, Contemporaneous, Original and Accurate.

What are the characteristics of unstructured data? Select all that apply.

Characteristics of unstructured data:

  • The data neither conforms to a data model nor has any inherent structure.
  • The data cannot be stored in rows and columns as in a database.
  • The data does not follow any semantics or rules.
  • The data lacks any particular format or sequence.
  • The data has no easily identifiable structure.

What are the most common processes and procedures handled by data engineers? Select all that apply.

Data engineers transform data into a useful format for analysis; give it a reliable infrastructure; and develop, maintain, and test systems.