With digitalization, the volume of data in the world is growing exponentially. Organizations use cloud platforms such as Azure and GCP to store and analyze this data and extract valuable business insights from it. In this article, you will study the top 11 Azure interview questions, which cover data services like Azure Cosmos DB, Azure SQL Database, and Azure Data Lake Storage for storing structured, unstructured, or semi-structured data. Let’s take a look at the Azure interview questions below.
Learning Objectives
In this article, we will learn about Azure interview questions covering Azure Cosmos DB (Request Units, Time to Live, database APIs), Azure SQL Database purchasing and deployment models, and Azure Storage services with lifecycle management.
A Request Unit (RU) is a performance currency that abstracts the system resources required to perform the database operations supported by Azure Cosmos DB, such as reads, inserts, and updates. How consumed Request Units are charged depends on the mode in which the Azure Cosmos DB account was created (for example, provisioned throughput or serverless).
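For instance, every response from the Cosmos DB .NET SDK reports the RUs consumed by that operation. A minimal sketch, assuming an existing Container object named container, a hypothetical Student type, and a partition key on /name:
// Read one item and print the Request Units the operation consumed
ItemResponse<Student> response = await container.ReadItemAsync<Student>(
    id: "1", partitionKey: new PartitionKey("Chaitanya Shah"));
Console.WriteLine($"RUs consumed: {response.RequestCharge}");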
Time to Live (TTL) in Cosmos DB automatically deletes items inside a container after a certain time period, consuming left-over Request Units to do so. Example of configuring TTL on an existing Cosmos DB container in the Azure portal:
Select the container -> under Settings, scroll to Time to Live -> select On and specify the TTL value in seconds -> Save.
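TTL can also be set when creating a container through the .NET SDK. A minimal sketch, assuming placeholder endpoint and key values and a hypothetical container named student partitioned on /name:
using Microsoft.Azure.Cosmos;

// Connect and create the database and container if they don't exist
CosmosClient client = new CosmosClient("<endpoint>", "<key>");
Database database = await client.CreateDatabaseIfNotExistsAsync("school");

// DefaultTimeToLive expires items 3600 seconds after their last write
ContainerProperties properties = new ContainerProperties(id: "student", partitionKeyPath: "/name")
{
    DefaultTimeToLive = 3600
};
Container container = await database.CreateContainerIfNotExistsAsync(properties);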
Azure Cosmos DB offers several database APIs: NoSQL, MongoDB, PostgreSQL, Cassandra, Gremlin, and Table. The Azure Cosmos DB API for NoSQL lets you query items using SQL syntax and offers performance isolation and analytical support. The API for MongoDB provides multiple write locations and automatic shard management, and stores data as documents in BSON format. Azure Cosmos DB for PostgreSQL is a managed, distributed PostgreSQL service for storing relational data at scale. The API for Cassandra supports horizontal scaling to store extensive data using a column-oriented schema. Example of creating a database named school using the Azure Cosmos DB API for NoSQL in .NET:
// Assumes an initialized CosmosClient named client
Database database1 = await client.CreateDatabaseAsync(
    id: "school"
);
For example, to insert a single document into a collection named student using the Azure Cosmos DB API for MongoDB in JavaScript:
db.student.insertOne({
    name: "Chaitanya Shah",
    age: 23,
    address: "24, Wall Colony"
});
Azure SQL Database offers the two purchasing models below:
a. vCore Purchasing Model: The vCore purchasing model lets users choose the physical characteristics of the hardware based on their application needs. In this model, customers can scale compute and storage resources independently.
b. DTU-based Purchasing Model: The Database Transaction Unit (DTU)-based purchasing model offers customers service tiers differentiated by fixed compute size, storage, read-write rates, and backup retention period. (A query to check which tier a database currently runs under is sketched below.)
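For instance, a database's current service tier and compute size can be inspected from T-SQL. A hedged sketch, where the database name school is illustrative:
-- Returns e.g. 'Standard' and 'S0' for a DTU-based database
SELECT DATABASEPROPERTYEX('school', 'Edition') AS ServiceTier,
       DATABASEPROPERTYEX('school', 'ServiceObjective') AS ServiceObjective;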
Below are the two deployment models provided by Azure SQL Database:
a. Single Database: The single database deployment model creates a database with a dedicated database engine, its own set of resources, performance monitoring, and service tiers.
b. Elastic Pool: The elastic pool deployment model lets customers purchase resources for a pool shared by multiple databases. We can add or remove databases from the pool based on resource utilization, so an elastic pool solves the problems of resource over-provisioning and under-provisioning (see the T-SQL sketch after this list).
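As a hedged sketch, an existing database can be moved into an elastic pool with T-SQL; school and pool1 are hypothetical names, and the pool must already exist:
-- Assign the database to the elastic pool named pool1
ALTER DATABASE school
MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL(name = pool1));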
While working on project ABC, you created an Azure Data Lake Storage Gen2 account named abc_account for storing application and infrastructure logs. The designated retention periods for application and infrastructure logs are 360 days and 60 days, respectively. As per current expectations, the logs will not be accessed during their retention periods. Design a solution for abc_account that minimizes storage costs and automatically deletes the logs at the end of each retention period.
Use the archive access tier to store application logs and the cool access tier to store infrastructure logs, which minimizes storage costs since neither set of logs is accessed during retention. To automatically delete the logs at the end of each retention period, use Azure Blob storage lifecycle management rules, as sketched below.
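A minimal sketch of the deletion rules, assuming the logs are uploaded directly to their target tiers under the hypothetical prefixes logs/application and logs/infrastructure:
{
  "rules": [
    {
      "name": "deleteApplicationLogs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "logs/application" ]
        },
        "actions": {
          "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 360 } }
        }
      }
    },
    {
      "name": "deleteInfrastructureLogs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "logs/infrastructure" ]
        },
        "actions": {
          "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 60 } }
        }
      }
    }
  ]
}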
Azure Storage Service provides highly scalable, accessible, secure, and managed services to store objects and blobs, create data lakes, share files, etc. Below are the Azure Storage data services:
a. Azure Blobs: a massively scalable object store for text and binary data.
b. Azure Files: managed file shares accessible via the SMB and NFS protocols.
c. Azure Queues: a messaging store for reliable messaging between application components.
d. Azure Tables: a NoSQL store for schemaless, structured data.
e. Azure Disks: block-level storage volumes for Azure virtual machines.
Write a lifecycle policy rule in Azure Blob Storage that transitions block blobs prefixed with container/school or container/college to the archive tier if they haven't been modified in 90 days, and to the cool tier if they haven't been modified in over 30 days.
Below is the lifecycle policy rule for the above scenario:
{
  "rules": [
    {
      "name": "agingPolicy",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "container/school", "container/college" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 }
          }
        }
      }
    }
  ]
}
The below statement creates a table named Depts with the columns DeptNo, DName, and Location:
CREATE TABLE Depts(
    DeptNo int PRIMARY KEY,
    DName nvarchar(50) NOT NULL,
    Location nvarchar(50)
);
Here, DeptNo is the primary key.
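For example, the primary key rejects duplicate department numbers; the sample values below are illustrative:
INSERT INTO Depts (DeptNo, DName, Location)
VALUES (10, 'Finance', 'Mumbai');

-- Fails with a primary key violation: DeptNo 10 already exists
INSERT INTO Depts (DeptNo, DName, Location)
VALUES (10, 'Marketing', 'Pune');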
You should use Azure Blob storage to create a data lake for big data analytics. Azure Blob storage lets users store unstructured data as blobs and provides high security, scalability, data availability, and disaster-recovery capabilities.
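As a minimal sketch with the Azure.Storage.Blobs .NET SDK, where the connection string, container name, and blob name are placeholders:
using Azure.Storage.Blobs;

// Connect to the storage account and create a container for raw data
BlobServiceClient service = new BlobServiceClient("<connection-string>");
BlobContainerClient container = service.GetBlobContainerClient("raw-logs");
await container.CreateIfNotExistsAsync();

// Upload an unstructured log file as a block blob
await container.UploadBlobAsync("2023/01/app.log", BinaryData.FromString("sample log line"));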
Microsoft Azure offers data services like Azure Cosmos DB, Azure SQL Database, and Azure Data Lake Storage for storing structured, unstructured, or semi-structured data. Azure Cosmos DB is a fully managed, multi-model NoSQL database for modern application development. Azure Storage Service provides highly scalable, accessible, secure, and managed services to store objects and blobs, create data lakes, share files, etc.
Top companies like Mercedes-Benz, Deloitte, PwC, Accenture, TCS, Razorpay, Swiggy, and Uber hire for job profiles that require Azure data services skills, such as Data Engineer, Data Scientist, and R&D-related data roles, at locations across the world. These profiles offer wide scope in terms of salary, challenging work environments, and opportunities to solve real-world problems. Working professionals need in-depth knowledge of Azure SQL, Azure Data Lake development, building APIs backed by Cosmos DB, creating data pipelines with Azure Data Factory, etc., to succeed in these roles.
Below are some important points from the above article on Azure interview questions:
a. A Request Unit (RU) abstracts the system resources needed for Cosmos DB operations, and Time to Live (TTL) automatically deletes expired items using left-over RUs.
b. Azure Cosmos DB offers database APIs for NoSQL, MongoDB, PostgreSQL, Cassandra, Gremlin, and Table workloads.
c. Azure SQL Database provides vCore and DTU-based purchasing models, and single database and elastic pool deployment models.
d. Azure Blob storage access tiers (hot, cool, archive) combined with lifecycle management rules minimize storage costs and automate blob deletion.
I hope you liked my article on Azure interview questions. Share your feedback with me in the comments section.