Verified Microsoft AI-100 Exam Online

All that matters here is passing the Microsoft AI-100 exam, and all you need is a high score on the AI-100 Designing and Implementing an Azure AI Solution exam. The only thing you need to do is download the Exambible AI-100 exam study guides now. We will not let you down, and we back that with our money-back guarantee.

Microsoft AI-100 Free Dumps Questions Online. Read and Test Now.

Page: 1 / 8
Total 101 questions Full Exam Access
Question 1
- (Exam Topic 2)
You are designing an AI solution that must meet the following processing requirements:
• Use a parallel processing framework that supports the in-memory processing of high volumes of data.
• Use in-memory caching and a columnar storage engine for Apache Hive queries.
What should you use to meet each requirement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Apache Spark
Apache Spark is a parallel processing framework that supports in-memory processing to boost the performance of big-data analytic applications. Apache Spark in Azure HDInsight is the Microsoft implementation of Apache Spark in the cloud.
Box 2: Interactive Query
Interactive Query provides in-memory caching and an improved columnar storage engine for Hive queries.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
https://docs.microsoft.com/bs-latn-ba/azure/hdinsight/interactive-query/apache-interactive-query-get-started
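For illustration, a minimal PySpark sketch (assuming an HDInsight Spark cluster with Hive support; the table name sensor_readings is hypothetical) shows the in-memory processing pattern the answer relies on: a cached DataFrame is queried repeatedly without re-reading the source.

from pyspark.sql import SparkSession

# Assumes an HDInsight Spark cluster; "sensor_readings" is a hypothetical Hive table.
spark = (SparkSession.builder
         .appName("inmemory-demo")
         .enableHiveSupport()   # lets Spark SQL query Hive tables
         .getOrCreate())

df = spark.sql("SELECT device_id, reading FROM sensor_readings")
df.cache()                      # keep the DataFrame in memory for repeated queries
print(df.filter(df.reading > 100).count())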

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 2
- (Exam Topic 2)
You are developing an application that will perform optical character recognition of photos of medical logbooks. You need to recommend a solution to validate the data against a validated set of records. Which service should you include in the recommendation?
My answer: -
Reference answer: D
Reference analysis:

References:
https://docs.microsoft.com/en-us/sql/master-data-services/validation-master-data-services?view=sql-server-2017

Question 3
- (Exam Topic 2)
You plan to implement a new data warehouse for a planned AI solution. You have the following information regarding the data warehouse:
• The data files will be available in one week.
• Most queries that will be executed against the data warehouse will be ad-hoc queries.
• The schemas of the data files that will be loaded to the data warehouse will change often.
• One month after the planned implementation, the data warehouse will contain 15 TB of data.
You need to recommend a database solution to support the planned implementation.
Which two solutions should you include in the recommendation? Each correct answer is a complete solution. NOTE: Each correct selection is worth one point.
My answer: -
Reference answer: AB
Reference analysis:

None

Question 4
- (Exam Topic 2)
You have Azure IoT Edge devices that collect measurements every 30 seconds. You plan to send the measurements to an Azure IoT hub. You need to ensure that every event is processed as quickly as possible. What should you use?
My answer: -
Reference answer: A
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/hdinsight/kafka/apache-kafka-connector-iot-hub

Question 5
- (Exam Topic 2)
You plan to design a solution for an AI implementation that uses data from IoT devices.
You need to recommend a data storage solution for the IoT devices that meets the following requirements:
• Allow data to be queried in real time as it streams into the solution.
• Provide the lowest amount of latency for loading data into the solution.
What should you include in the recommendation?
My answer: -
Reference answer: D
Reference analysis:

Internet of Things (IoT): You can use HDInsight to process streaming data that is received in real time from a variety of devices, and to build applications that extract critical insights from that data. You can also use Azure Machine Learning on top of it to predict future trends for your business.
By combining enterprise-scale R analytics software with the power of Apache Hadoop and Apache Spark, Microsoft R Server for HDInsight gives you the scale and performance you need. Multi-threaded math libraries and transparent parallelization in R Server handle up to 1,000x more data at up to 50x faster speeds than open-source R, which helps you train more accurate models for better predictions.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction

Question 6
- (Exam Topic 2)
Your company recently deployed several hardware devices that contain sensors.
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several years.
During the past two months, the sensors generated 300 GB of data.
You plan to move the data to Azure and then perform advanced analytics on the data. You need to recommend an Azure storage solution for the data.
Which storage solution should you recommend?
My answer: -
Reference answer: C
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage

Question 7
- (Exam Topic 1)
You need to recommend a data storage solution that meets the technical requirements.
What is the best data storage solution to recommend? More than one answer choice may achieve the goal. Select the BEST answer.
My answer: -
Reference answer: B
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/architecture/example-scenario/ai/commerce-chatbot

Question 8
- (Exam Topic 1)
You need to integrate the new Bookings app and the Butler chatbot.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
AI-100 dumps exhibit
Solution:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-channel-connect-webchat?view=azure-bot-servic

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 9
- (Exam Topic 2)
You are designing a solution that will ingest data from an Azure IoT Edge device, preprocess the data in Azure Machine Learning, and then move the data to Azure HDInsight for further processing.
What should you include in the solution? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Export Data
Use the Export data to Hive option in the Export Data module in Azure Machine Learning Studio. This option is useful when you are working with very large datasets and want to save your machine learning experiment data to a Hadoop cluster or HDInsight distributed storage.
Box 2: Apache Hive
Apache Hive is a data warehouse system for Apache Hadoop. Hive enables data summarization, querying, and analysis. Hive queries are written in HiveQL, a query language similar to SQL.
Box 3: Azure Data Lake
Default storage for the HDFS file system of HDInsight clusters can be associated with either an Azure Storage account or an Azure Data Lake Storage account.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/export-to-hive-query
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/hdinsight-use-hive
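As an illustration of querying the exported Hive data from Python, here is a minimal sketch using the third-party PyHive package; the host name, credentials, and the table name exported_experiment_data are assumptions, and in practice the client typically needs network access to the cluster.

from pyhive import hive   # third-party package: pip install pyhive

# Host, credentials, and table name below are placeholders.
conn = hive.Connection(host="mycluster-hive.example.net", port=10000,
                       username="hiveuser", database="default")
cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM exported_experiment_data")
print(cursor.fetchone())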

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 10
- (Exam Topic 2)
You are designing a solution that uses drones to monitor remote locations for anomalies. The drones have Azure IoT Edge devices. The solution must meet the following requirements:
• Email a user the picture and location of an anomaly when an anomaly is detected.
• Use a video stream to detect anomalies at the location.
• Send the pictures and location information to Azure.
• Use the least amount of code possible.
You develop a custom vision Azure Machine Learning module to detect the anomalies.
Which service should you use for each requirement? To answer, drag the appropriate services to the correct requirements. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure IoT Edge
Example: You configure the Remote Monitoring solution to respond to anomalies detected by an IoT Edge device. IoT Edge devices let you process telemetry at the edge to reduce the volume of telemetry sent to the solution and to enable faster responses to events on devices.
Box 2: Azure Functions
Box 3: Azure Logic Apps
References:
https://docs.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remote-monitoring-edge
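To illustrate the IoT Edge piece, here is a minimal sketch of an edge module written with the Python azure-iot-device SDK that forwards anomaly details upstream; the output name, blob URI, and payload fields are assumptions, not part of the reference scenario.

import json
from azure.iot.device import IoTHubModuleClient, Message

# Runs inside an IoT Edge module; output name "anomalies" and payload values are placeholders.
client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

payload = {"anomaly": True,
           "picture_uri": "https://example.blob.core.windows.net/pics/0001.jpg",
           "location": {"lat": 47.64, "lon": -122.13}}
client.send_message_to_output(Message(json.dumps(payload)), "anomalies")
client.disconnect()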

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 11
- (Exam Topic 2)
You have thousands of images that contain text.
You need to process the text from the images into a machine-readable character stream. Which Azure Cognitive Services service should you use?
My answer: -
Reference answer: C
Reference analysis:

With Computer Vision you can detect text in an image using optical character recognition (OCR) and extract the recognized words into a machine-readable character stream.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
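A minimal sketch of calling the Computer Vision OCR REST operation with the requests library; the region, subscription key, and image file name are placeholders.

import requests

# Region, subscription key, and image file are placeholders.
endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/ocr"
headers = {"Ocp-Apim-Subscription-Key": "<your-key>",
           "Content-Type": "application/octet-stream"}
params = {"language": "en", "detectOrientation": "true"}

with open("logbook_page.jpg", "rb") as image:
    response = requests.post(endpoint, headers=headers, params=params, data=image)
response.raise_for_status()

# Flatten regions -> lines -> words into a machine-readable character stream.
for region in response.json().get("regions", []):
    for line in region["lines"]:
        print(" ".join(word["text"] for word in line["words"]))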

Question 12
- (Exam Topic 2)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Functions as an IoT Edge module.
Does this meet the goal?
My answer: -
Reference answer: B
Reference analysis:

Instead, use Azure Stream Analytics and the REST API.
Note: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
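For context, the REST pattern mentioned above (a client posting data to an Azure Machine Learning scoring endpoint) looks roughly like the following sketch; the endpoint URL, API key, and input schema are assumptions.

import json
import requests

# Scoring URI, API key, and input schema below are placeholders.
scoring_uri = "https://example-region.azureml.net/score"
headers = {"Authorization": "Bearer <api-key>", "Content-Type": "application/json"}
body = {"data": [{"temperature": 71.3, "timestamp": "2019-05-01T12:00:00Z"}]}

response = requests.post(scoring_uri, headers=headers, data=json.dumps(body))
response.raise_for_status()
print(response.json())   # e.g. an anomaly score returned by the model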

Question 13
- (Exam Topic 2)
You need to build an API pipeline that analyzes streaming data. The pipeline will perform the following:
• Visual text recognition
• Audio transcription
• Sentiment analysis
• Face detection
Which Azure Cognitive Services should you use in the pipeline?
My answer: -
Reference answer: D
Reference analysis:

Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, and Cognitive Services (such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It enables you to extract insights from your videos using the Video Indexer video and audio models described below:
• Visual text recognition (OCR): Extracts text that is visually displayed in the video.
• Audio transcription: Converts speech to text in 12 languages and allows extensions.
• Sentiment analysis: Identifies positive, negative, and neutral sentiments from speech and visual text.
• Face detection: Detects and groups faces appearing in the video.
References:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
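A minimal sketch of the Video Indexer REST flow (obtain an account access token, then submit a video for indexing); the location, account ID, subscription key, and video URL are placeholders.

import requests

# Location, account ID, key, and video URL are placeholders.
location, account_id, api_key = "trial", "<account-id>", "<subscription-key>"

token = requests.get(
    f"https://api.videoindexer.ai/auth/{location}/Accounts/{account_id}/AccessToken",
    headers={"Ocp-Apim-Subscription-Key": api_key},
    params={"allowEdit": "true"}).json()

upload = requests.post(
    f"https://api.videoindexer.ai/{location}/Accounts/{account_id}/Videos",
    params={"accessToken": token, "name": "pipeline-sample",
            "videoUrl": "https://example.com/clip.mp4"})
print(upload.json().get("id"))   # video ID used to poll for OCR, transcript, sentiment, and face results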

Question 14
- (Exam Topic 2)
You develop a custom application that uses a token to connect to Azure Cognitive Services resources. A new security policy requires that all access keys be changed every 30 days.
You need to recommend a solution to implement the security policy.
Which three actions should you recommend be performed every 30 days? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
AI-100 dumps exhibit
Solution:
Step 1: Generate new keys in the Cognitive Service resources
AI-100 dumps exhibit
Step 2: Retrieve a token from the Cognitive Services endpoint
Step 3: Update the custom application to use the new authorization
Each request to an Azure Cognitive Service must include an authentication header. This header passes along a subscription key or access token, which is used to validate your subscription for a service or group of services.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/authentication
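A minimal sketch of the rotation flow described above: after regenerating the key, exchange it for a short-lived token at the issueToken endpoint and use that token in the Authorization header; the region and key values are placeholders.

import requests

# Region and regenerated key are placeholders.
region, new_key = "westus", "<regenerated-key>"

token = requests.post(
    f"https://{region}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
    headers={"Ocp-Apim-Subscription-Key": new_key}).text   # token is valid for about 10 minutes

auth_headers = {"Authorization": f"Bearer {token}"}
# The custom application then sends its Cognitive Services requests with auth_headers.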

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 15
- (Exam Topic 2)
You need to build a sentiment analysis solution that will use input data from JSON documents and PDF documents. The JSON documents must be processed in batches and aggregated.
Which storage type should you use for each file type? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/batch-processing

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 16
- (Exam Topic 2)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio. You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You write a custom scoring script.
Does this meet the goal?
My answer: -
Reference answer: B
Reference analysis:

You need to enable model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
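For reference, a minimal sketch of what enabling model data collection looks like in a scoring script with the azureml-monitoring package; the model name, feature names, and the parsing/scoring stubs are assumptions.

from azureml.monitoring import ModelDataCollector   # package: azureml-monitoring

def init():
    global inputs_dc, predictions_dc
    # Model name and feature names are placeholders.
    inputs_dc = ModelDataCollector("my_model", designation="inputs",
                                   feature_names=["feature1", "feature2"])
    predictions_dc = ModelDataCollector("my_model", designation="predictions",
                                        feature_names=["prediction"])

def run(raw_data):
    data = [[1.0, 2.0]]            # a real script would parse raw_data here
    result = [0]                   # a real script would call the model here
    inputs_dc.collect(data)        # log model inputs for monitoring
    predictions_dc.collect(result) # log model outputs for monitoring
    return result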

Question 17
- (Exam Topic 1)
Which RBAC role should you assign to the KeyManagers group?
My answer: -
Reference answer: A
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles

Question 18
- (Exam Topic 2)
Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL database.
You need to recommend a data processing solution that uses the Transact-SQL language. Which data processing solution should you recommend?
My answer: -
Reference answer: A
Reference analysis:

References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany

Question 19
- (Exam Topic 2)
You are designing a solution that will use the Azure Content Moderator service to moderate user-generated content.
You need to moderate custom predefined content without repeatedly scanning the collected content. Which API should you use?
My answer: -
Reference answer: A
Reference analysis:

The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs. However, you might need to screen for terms that are specific to your organization. For example, you might want to tag competitor names for further review.
Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text - Screen operation scans your text for profanity, and also compares text against custom and shared blacklists.
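A minimal sketch of the Text - Screen call against a custom term list via REST; the region, subscription key, list ID, and sample text are placeholders.

import requests

# Region, key, and list ID are placeholders.
region, key = "westus", "<subscription-key>"
url = (f"https://{region}.api.cognitive.microsoft.com"
       "/contentmoderator/moderate/v1.0/ProcessText/Screen")
headers = {"Ocp-Apim-Subscription-Key": key, "Content-Type": "text/plain"}
params = {"language": "eng", "listId": "<custom-term-list-id>"}

response = requests.post(url, headers=headers, params=params,
                         data="Sample user-generated text to screen.")
response.raise_for_status()
print(response.json().get("Terms"))   # matched custom or blacklisted terms, if any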

Question 20
- (Exam Topic 2)
You are designing an AI solution that will analyze millions of pictures.
You need to recommend a solution for storing the pictures. The solution must minimize costs. Which storage solution should you recommend?
My answer: -
Reference answer: C
Reference analysis:

Azure Data Lake Storage will be slightly more expensive, although the two are in a close price range. Blob storage has more pricing options, depending on factors such as how frequently you need to access your data (cool vs. hot storage tiers).
References:
http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing
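A minimal sketch of storing one of the pictures in Blob storage with the azure-storage-blob package; the connection string, container name, and file name are placeholders.

from azure.storage.blob import BlobServiceClient

# Connection string, container, and file name are placeholders.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("pictures")

with open("photo-000001.jpg", "rb") as data:
    container.upload_blob(name="photo-000001.jpg", data=data, overwrite=True)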
