Designing And Implementing An Azure AI Solution AI-100 Pack

Our pass rate is as high as 98.9%, and the similarity between our AI-100 study guide and the real exam is 90%, based on our seven years of teaching experience. Do you want to pass the Microsoft AI-100 exam in just one try? Try the latest Microsoft AI-100 practice questions and answers first.

Free AI-100 Demo Online For Microsoft Certification:

Page: 1 / 8
Total 101 questions Full Exam Access
Question 1
- (Exam Topic 2)
You are designing an Azure infrastructure to support an Azure Machine Learning solution that will have multiple phases. The solution must meet the following requirements:
• Securely query an on-premises database once a week to update product lists.
• Access the data without using a gateway.
• Orchestrate the separate phases.
What should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure App Service Hybrid Connections
With Hybrid Connections, Azure websites and mobile services can access on-premises resources as if they were located on the same private network. Application admins thus have the flexibility to simply lift and shift specific front-end tiers to Azure with minimal configuration changes, extending their enterprise apps for hybrid scenarios.
Incorrect option: The VPN connection solutions both use gateways.
Box 2: Machine Learning pipelines
Running machine learning algorithms typically involves a sequence of tasks including pre-processing, feature extraction, model fitting, and validation stages. For example, classifying text documents might involve text segmentation and cleaning, extracting features, and training a classification model with cross-validation. Though there are many libraries we can use for each stage, connecting the dots is not as easy as it may look, especially with large-scale datasets. Most ML libraries are not designed for distributed computation, nor do they provide native support for pipeline creation and tuning.
Box 3: Azure Databricks
References:
https://azure.microsoft.com/is-is/blog/hybrid-connections-preview/
https://databricks.com/glossary/what-are-ml-pipelines
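The pipeline idea described above — a fixed sequence of stages where each stage's output feeds the next — can be sketched in a few lines of plain Python. This is only a conceptual illustration of chaining stages, not the Azure Machine Learning SDK; the stage functions are hypothetical examples.

```python
# Minimal sketch of the pipeline pattern: each stage is a plain function,
# and the pipeline runs them in order, feeding each output to the next stage.
# (Conceptual illustration only -- not the Azure Machine Learning SDK.)

def preprocess(text):
    """Lowercase the text and strip punctuation-like noise."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())

def extract_features(text):
    """Very simple bag-of-words feature extraction."""
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def run_pipeline(stages, data):
    """Run each stage in sequence, passing results along."""
    for stage in stages:
        data = stage(data)
    return data

# Pre-processing followed by feature extraction, as in the text example above.
features = run_pipeline([preprocess, extract_features], "Spam, spam, SPAM!")
```

Real pipeline frameworks add what this sketch lacks: distributed execution, caching of intermediate results, and hyperparameter tuning across the whole chain.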

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 2
- (Exam Topic 2)
You are designing a solution that uses drones to monitor remote locations for anomalies. The drones have Azure IoT Edge devices. The solution must meet the following requirements:
• Email a user the picture and location of an anomaly when an anomaly is detected.
• Use a video stream to detect anomalies at the location.
• Send the pictures and location information to Azure.
• Use the least amount of code possible.
You develop a custom vision Azure Machine Learning module to detect the anomalies.
Which service should you use for each requirement? To answer, drag the appropriate services to the correct requirements. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure IoT Edge
Example:
You configure the Remote Monitoring solution to respond to anomalies detected by an IoT Edge device. IoT Edge devices let you process telemetry at the edge to reduce the volume of telemetry sent to the solution and to enable faster responses to events on devices.
Box 2: Azure Functions
Box 3: Azure Logic Apps
References:
https://docs.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remote-monitoring-edge

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 3
- (Exam Topic 2)
You are designing an AI solution that must meet the following processing requirements:
• Use a parallel processing framework that supports the in-memory processing of high volumes of data.
• Use in-memory caching and a columnar storage engine for Apache Hive queries.
What should you use to meet each requirement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Apache Spark
Apache Spark is a parallel processing framework that supports in-memory processing to boost the performance of big-data analytic applications. Apache Spark in Azure HDInsight is the Microsoft implementation of Apache Spark in the cloud.
Box 2: Interactive Query
Interactive Query provides in-memory caching and an improved columnar storage engine for Hive queries.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
https://docs.microsoft.com/bs-latn-ba/azure/hdinsight/interactive-query/apache-interactive-query-get-started
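The parallel, in-memory processing model Spark is built on — partition the data, process partitions concurrently, combine the partial results — can be illustrated with a small stand-alone sketch. This is conceptual only: real Spark distributes work across a cluster, not threads in one process.

```python
# Toy illustration of the partition -> parallel map -> combine pattern
# that Spark popularized. (Conceptual only -- not Apache Spark itself.)
from concurrent.futures import ThreadPoolExecutor

def partition(data, n):
    """Split data into n roughly equal in-memory chunks."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def partial_sum(chunk):
    """Work done on one partition, independently of the others."""
    return sum(chunk)

data = list(range(1, 101))
with ThreadPoolExecutor(max_workers=4) as pool:
    # Each partition is processed concurrently; results are then combined.
    total = sum(pool.map(partial_sum, partition(data, 4)))
```

Spark adds fault tolerance, lazy evaluation, and cluster-wide scheduling on top of this basic shape, which is why it suits high volumes of data far better than a hand-rolled thread pool.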

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 4
- (Exam Topic 2)
You have thousands of images that contain text.
You need to process the text from the images into a machine-readable character stream. Which Azure Cognitive Services service should you use?
My answer: -
Reference answer: C
Reference analysis:

With Computer Vision you can detect text in an image using optical character recognition (OCR) and extract the recognized words into a machine-readable character stream.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api

Question 5
- (Exam Topic 2)
You are designing an AI solution that will provide feedback to teachers who train students over the Internet. The students will be in classrooms located in remote areas. The solution will capture video and audio data of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
• Alert teachers if a student seems angry or distracted.
• Identify each student in the classrooms for attendance purposes.
• Allow the teachers to log the text of conversations between themselves and the students.
Which Cognitive Services should you recommend?
My answer: -
Reference answer: E
Reference analysis:

Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, Cognitive Services (such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It enables you to extract the insights from your videos using Video Indexer video and audio models.
Face API enables you to search, identify, and match faces in your private repository of up to 1 million people. The Face API now integrates emotion recognition, returning the confidence across a set of emotions for each face in the image such as anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. These emotions are understood to be cross-culturally and universally communicated with particular facial expressions.
Speech-to-text from Azure Speech Services enables real-time transcription of audio streams into text that your applications, tools, or devices can consume, display, and act on as command input. This service is powered by the same recognition technology that Microsoft uses for Cortana and Office products, and it works seamlessly with the translation and text-to-speech services.

Question 6
- (Exam Topic 2)
You need to build an interactive website that will accept uploaded images, and then ask a series of predefined questions based on each image.
Which services should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure Bot Service
Box 2: Computer Vision
The Computer Vision "Analyze an image" feature returns information about visual content found in an image. Use tagging, domain-specific models, and descriptions in four languages to identify content and label it with confidence. Use Object Detection to get the locations of thousands of objects within an image. Apply the adult/racy settings to help detect potential adult content. Identify image types and color schemes in pictures.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 7
- (Exam Topic 2)
You have an Azure Machine Learning experiment that must comply with GDPR regulations. You need to track compliance of the experiment and store documentation about the experiment. What should you use?
My answer: -
Reference answer: D
Reference analysis:

References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/

Question 8
- (Exam Topic 2)
Your company has recently purchased and deployed 25,000 IoT devices.
You need to recommend a data analysis solution for the devices that meets the following requirements:
• Each device must use its own credentials for identity.
• Each device must be able to route data to multiple endpoints.
• The solution must require the minimum amount of customized code.
What should you include in the recommendation?
My answer: -
Reference answer: C
Reference analysis:

An IoT hub has a default built-in endpoint. You can create custom endpoints to route messages to by linking other services in your subscription to the hub.
Individual devices connect using credentials stored in the IoT hub's identity registry.
References:
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-security

Question 9
- (Exam Topic 2)
You have an AI application that uses keys in Azure Key Vault.
Recently, a key used by the application was deleted accidentally and was unrecoverable. You need to ensure that if a key is deleted, it is retained in the key vault for 90 days. Which two features should you configure? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
My answer: -
Reference answer: BC
Reference analysis:

Soft delete retains deleted keys in the key vault for a retention period (90 days by default), and purge protection prevents retained keys from being permanently deleted during that period.
References:
https://docs.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview

Question 10
- (Exam Topic 2)
You create an Azure Cognitive Services resource.
A data scientist needs to call the resource from Azure Logic Apps.
Which two values should you provide to the data scientist? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
My answer: -
Reference answer: DE
Reference analysis:

References:
https://social.technet.microsoft.com/wiki/contents/articles/36074.logic-apps-with-azure-cognitive-service.aspx

Question 11
- (Exam Topic 2)
You are designing a solution that will ingest temperature data from IoT devices, calculate the average temperature, and then take action based on the aggregated data. The solution must meet the following requirements:
• Minimize the amount of uploaded data.
• Take action based on the aggregated data as quickly as possible.
What should you include in the solution? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure Functions
Azure Functions is a serverless service that hosts small pieces of code (functions) and can be used, for example, for event-driven applications.
A general rule is difficult because everything depends on your requirements, but if you have to analyze a data stream, look at Azure Stream Analytics; if you want to implement a serverless event-driven or timer-based application, check Azure Functions or Logic Apps.
Note: Azure IoT Edge allows you to deploy complex event processing, machine learning, image recognition, and other high value AI without writing it in-house. Azure services like Azure Functions, Azure Stream Analytics, and Azure Machine Learning can all be run on-premises via Azure IoT Edge.
Box 2: An Azure IoT Edge device
Azure IoT Edge moves cloud analytics and custom business logic to devices so that your organization can focus on business insights instead of data management.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/about-iot-edge
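The edge-side aggregation pattern this answer relies on — buffer raw readings on the device and upload only a small summary per window — can be sketched as follows. The function and field names are hypothetical; a real IoT Edge module would send the summary through the IoT Edge hub rather than return it.

```python
# Sketch of edge-side aggregation: many raw temperature samples are reduced
# to one small summary message, minimizing the data uploaded to the cloud.
# (Hypothetical helper names -- not the Azure IoT Edge SDK.)

def aggregate_window(readings):
    """Reduce a window of readings to a single compact summary message."""
    return {
        "count": len(readings),
        "avg": sum(readings) / len(readings),
        "max": max(readings),
    }

window = [21.5, 22.0, 22.5, 40.0]   # raw samples collected on the device
summary = aggregate_window(window)   # one message uploaded instead of four
```

Acting on the aggregate at the edge (for example, alerting when `summary["max"]` crosses a threshold) is also what makes the response as fast as possible, since no cloud round trip is needed.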

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 12
- (Exam Topic 2)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container. You need to monitor the accuracy of each run of the model.
Solution: You modify the scoring file.
Does this meet the goal?
My answer: -
Reference answer: B
Reference analysis:

None

Question 13
- (Exam Topic 2)
You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB.
Recently, you loaded a large amount of data to the database. The data was for a customer named Contoso, Ltd. You discover that queries for the Contoso data are slow to complete, and the queries slow the entire application.
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize costs. What is the best way to achieve the goal? More than one answer choice may achieve the goal. Select the BEST answer.
My answer: -
Reference answer: B
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning

Question 14
- (Exam Topic 2)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not
appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Functions as an IoT Edge module.
Does this meet the goal?
My answer: -
Reference answer: B
Reference analysis:

Instead, use Azure Stream Analytics and the REST API.
Note: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via the REST API, that call out to Azure Machine Learning endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
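To make the idea of streaming anomaly detection concrete, here is a small stand-alone sketch of a sliding-window spike detector. This is only an illustration of the concept; it is not Stream Analytics' built-in anomaly detection, which uses more sophisticated models for temporary and persistent anomalies.

```python
# Toy sliding-window spike detector: a value is anomalous if it deviates
# from the recent window mean by more than `threshold` standard deviations.
# (Conceptual illustration only -- not Azure Stream Analytics.)
from collections import deque

def make_spike_detector(window_size=5, threshold=3.0):
    history = deque(maxlen=window_size)

    def check(value):
        if len(history) == window_size:
            mean = sum(history) / window_size
            var = sum((x - mean) ** 2 for x in history) / window_size
            std = var ** 0.5
            anomalous = std > 0 and abs(value - mean) > threshold * std
        else:
            # Not enough history yet to judge.
            anomalous = False
        history.append(value)
        return anomalous

    return check

detect = make_spike_detector()
results = [detect(v) for v in [10, 10, 11, 10, 10, 10, 50]]
```

The final reading (50) is far outside the recent window and is flagged; the steady readings before it are not.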

Question 15
- (Exam Topic 2)
You need to deploy cognitive search. You provision an Azure Search service. What should you do next?
My answer: -
Reference answer: D
Reference analysis:

You create a data source, a skillset, and an index. These three components become part of an indexer that pulls each piece together into a single multi-phased operation.
Note: At the start of the pipeline, you have unstructured text or non-text content (such as image and scanned document JPEG files). Data must exist in an Azure data storage service that can be accessed by an indexer.
Indexers can "crack" source documents to extract text from source data.
References:
https://docs.microsoft.com/en-us/azure/search/cognitive-search-tutorial-blob
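The three components described above fit together through the indexer definition, which names a data source, a skillset, and a target index. The sketch below shows that relationship as plain Python dictionaries; the field names loosely follow the Azure Search REST API, and all resource names are hypothetical, so verify both against the official documentation.

```python
# Sketch of how an indexer ties the three components into one operation.
# (Hypothetical names; field names loosely follow the Azure Search REST API.)
data_source = {"name": "blob-datasource", "type": "azureblob"}
skillset = {"name": "ocr-skillset", "skills": ["OcrSkill", "MergeSkill"]}
index = {"name": "docs-index", "fields": ["id", "content"]}

indexer = {
    "name": "docs-indexer",
    "dataSourceName": data_source["name"],   # where documents come from
    "skillsetName": skillset["name"],        # enrichment steps applied
    "targetIndexName": index["name"],        # where enriched results land
}
```

The indexer pulls documents from the data source, runs them through the skillset's enrichment steps, and writes the results into the target index in a single multi-phased operation.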

Question 16
- (Exam Topic 2)
Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL database.
You need to recommend a data processing solution that uses the Transact-SQL language. Which data processing solution should you recommend?
My answer: -
Reference answer: A
Reference analysis:

References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany

Question 17
- (Exam Topic 2)
You are designing an application to parse images of business forms and upload the data to a database. The upload process will occur once a week.
You need to recommend which services to use for the application. The solution must minimize infrastructure costs.
Which services should you recommend? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit
Solution:
Box 1: Azure Cognitive Services
Azure Cognitive Services include image-processing algorithms to smartly identify, caption, index, and moderate your pictures and videos.
Incorrect: the Azure Linguistic Analytics API provides advanced natural language processing over raw text.
Box 2: Azure Data Factory
The Azure Data Factory (ADF) is a service designed to allow developers to integrate disparate data sources. It is a platform somewhat like SSIS in the cloud to manage the data you have both on-prem and in the cloud.
It provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blob and Tables) and Azure SQL Database.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 18
- (Exam Topic 2)
Your company has a data team of Transact-SQL experts.
You plan to ingest data from multiple sources into Azure Event Hubs.
You need to recommend which technology the data team should use to move and query data from Event Hubs to Azure Storage. The solution must leverage the data team’s existing skills.
What is the best recommendation to achieve the goal? More than one answer choice may achieve the goal.
My answer: -
Reference answer: B
Reference analysis:

Event Hubs Capture is the easiest way to automatically deliver streamed data in Event Hubs to an Azure Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other storage destinations of your choice, such as SQL Data Warehouse or Cosmos DB.
You can capture data from your event hub into a SQL data warehouse by using an Azure function triggered by an event grid.
Example:
First, you create an event hub with the Capture feature enabled and set an Azure blob storage as the destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically captured into Azure Storage as Avro files.
Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the Azure Function endpoint as its destination.
Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL data warehouse.
References:
https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse
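The hand-off in this flow is the Event Grid notification: Capture writes an Avro file to blob storage, and the function receives an event containing the blob URI. The sketch below shows only the event-parsing step; the payload is a simplified, hypothetical example, so check the Event Grid schema documentation for the exact field names.

```python
# Sketch of the function's trigger handling: extract the Avro file URL from
# a batch of Event Grid capture events before migrating its rows onward.
# (The payload below is simplified and hypothetical.)
import json

event_json = """
[{
  "eventType": "Microsoft.EventHub.CaptureFileCreated",
  "data": { "fileUrl": "https://example.blob.core.windows.net/capture/file1.avro" }
}]
"""

def capture_file_urls(payload):
    """Return the Avro file URLs from a batch of capture events."""
    return [e["data"]["fileUrl"]
            for e in json.loads(payload)
            if e.get("eventType") == "Microsoft.EventHub.CaptureFileCreated"]

urls = capture_file_urls(event_json)
```

In the real flow, the function would then read each Avro file from blob storage and load its records into the SQL data warehouse.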

Question 19
- (Exam Topic 2)
You plan to design a solution for an AI implementation that uses data from IoT devices.
You need to recommend a data storage solution for the IoT devices that meets the following requirements:
• Allow data to be queried in real time as it streams into the solution.
• Provide the lowest amount of latency for loading data into the solution.
What should you include in the recommendation?
My answer: -
Reference answer: D
Reference analysis:

Internet of Things (IoT): You can use HDInsight to process streaming data that's received in real time from a variety of devices.
You can use HDInsight to build applications that extract critical insights from data. You can also use Azure Machine Learning on top of that to predict future trends for your business.
By combining enterprise-scale R analytics software with the power of Apache Hadoop and Apache Spark, Microsoft R Server for HDInsight gives you the scale and performance you need. Multi-threaded math libraries and transparent parallelization in R Server handle up to 1000x more data and up to 50x faster speeds than open-source R, which helps you to train more accurate models for better predictions.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction

Question 20
- (Exam Topic 2)
You deploy an Azure bot.
You need to collect Key Performance Indicator (KPI) data from the bot. The type of data includes:
• The number of users interacting with the bot
• The number of messages interacting with the bot
• The number of messages on different channels received by the bot
• The number of users and messages continuously interacting with the bot
What should you configure?
My answer: -
Reference answer: A
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-manage-analytics

Question 21
- (Exam Topic 2)
Your company develops an AI application that is orchestrated by using Kubernetes. You need to deploy the application.
Which three actions should you perform? Each correct answer presents part of the solution. NOTE: Each
correct selection is worth one point.
My answer: -
Reference answer: ABC
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/aks/tutorial-kubernetes-prepare-app

Question 22
- (Exam Topic 2)
You need to build an AI solution that will be shared between several developers and customers. You plan to write code, host code, and document the runtime all within a single user experience. You build the environment to host the solution.
Which three actions should you perform in sequence next? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
AI-100 dumps exhibit
Solution:
Step 1: Create an Azure Machine Learning Studio workspace
Step 2: Create a notebook
You can manage notebooks using the UI, the CLI, and by invoking the Workspace API. To create a notebook:
1. Click the Workspace button or the Home button in the sidebar, then do one of the following:
• Next to any folder, click the menu dropdown on the right side of the text and select Create > Notebook.
• In the Workspace or a user folder, click the down caret and select Create > Notebook.
2. In the Create Notebook dialog, enter a name and select the notebook's primary language.
3. If there are running clusters, the Cluster drop-down displays. Select the cluster to attach the notebook to.
4. Click Create.
Step 3: Create a new experiment
Create a new experiment by clicking +NEW at the bottom of the Machine Learning Studio window. Select EXPERIMENT > Blank Experiment.
References:
https://docs.azuredatabricks.net/user-guide/notebooks/notebook-manage.html
https://docs.microsoft.com/en-us/azure/machine-learning/service/quickstart-run-cloud-notebook

Does this meet the goal?
My answer: -
Reference answer: A
Reference analysis:

None

Question 23
- (Exam Topic 2)
You plan to deploy Azure IoT Edge devices that will each store more than 10,000 images locally and classify the images by using a Custom Vision Service classifier. Each image is approximately 5 MB.
You need to ensure that the images persist on the devices for 14 days. What should you use?
My answer: -
Reference answer: B
Reference analysis:

References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob
