
Azure Data Factory Online Test

Topics Covered

  • Deployment
  • REST API
  • Custom Activity Development
  • Migration
  • Features
  • Data Management Gateway

Useful for hiring

  • Azure Data Developer
  • Azure Developer - Data Factory
  • Sr. Software Developer - Azure Data Factory

Start hiring job-fit candidates using this assessment


"We are impressed with Interview Mocha's comprehensive IT skill test batteries and their ability to provide tailor-made assessments quickly. The results we got from Interview Mocha exceeded our expectations, employees found the tests quick, fun, and a benchmark of their knowledge."

Ajay Garg, Associate Manager, Siemens

I want to slice the blob data by year, month, and day. I have files named like myblobcontainer/log_20151231_144229.csv, which clearly follows the pattern YYYYMMDD_HHMMSS.

I want to process files hourly, avoid re-processing anything, and ideally avoid restructuring my blobs. How should I accomplish this scenario?

    • Set folder path as $$Text.Format('myblobcontainer/log_{YYYYMMDD_HHMMSS}', WindowStart)

    • Set activity source as $$Text.Format('myblobcontainer/log_{0:yyyyMMdd}', WindowStart)

    • Set folder path as $$Text.Format('myblobcontainer/log_{0:yyyyMMdd}', WindowStart)

    • Create a custom C# activity to parse the files before copying
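In Data Factory v1, time-based slicing like this is usually expressed in the dataset definition, either with a `$$Text.Format(...)` expression as in the options above or with `partitionedBy` variables in `folderPath`. A minimal sketch of the `partitionedBy` approach is below; the dataset name `BlobLogInput` and linked service name `StorageLinkedService` are placeholders, not names from the question.

```json
{
  "name": "BlobLogInput",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "StorageLinkedService",
    "typeProperties": {
      "folderPath": "myblobcontainer/{Year}/{Month}/{Day}",
      "partitionedBy": [
        { "name": "Year",  "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
        { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
        { "name": "Day",   "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } }
      ]
    },
    "availability": { "frequency": "Hour", "interval": 1 }
  }
}
```

Each hourly slice then resolves `{Year}/{Month}/{Day}` from its own `SliceStart`, so slices are processed once and never overlap.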


I have recently started working on Azure Data Factory and am currently exploring the JSON structures of the various objects in Data Factory. I'm currently looking at the JSON structure for defining a pipeline. Which of the following properties are non-mandatory in the pipeline JSON definition?

Note: There can be multiple correct answers to this question.

    • name

    • description

    • expirationTime

    • dataSets
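For reference, a skeletal Data Factory v1 pipeline definition looks roughly like the sketch below. The pipeline name `CopyLogsPipeline` and the dates are placeholders; a real pipeline needs at least one entry in `activities`, while `description` is optional free text.

```json
{
  "name": "CopyLogsPipeline",
  "properties": {
    "description": "Optional free-text description of the pipeline",
    "activities": [],
    "start": "2015-12-01T00:00:00Z",
    "end": "2016-01-01T00:00:00Z"
  }
}
```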


I have created a pipeline, and now I want to deploy it to three different environments (Dev, Test, Prod). How can I achieve this dynamically without changing the JSON code? Which is the optimal solution?

    • Create a separate custom C# utility to replace the credentials and parameters per environment

    • Create a separate version of all objects for all 3 environments

    • Create 3 configuration files in Visual Studio that hold the credentials and parameters for the respective environments

    • Create a PowerShell deployment script to replace string values in the JSON dynamically
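The Visual Studio tooling for Data Factory v1 supported per-environment configuration files (e.g. `Dev.json`, `Test.json`, `Prod.json`) that override properties by JSONPath at deployment time. A sketch of such a file is below; the linked service name `AzureStorageLinkedService` and the connection string are illustrative placeholders.

```json
{
  "AzureStorageLinkedService": [
    {
      "name": "$.properties.typeProperties.connectionString",
      "value": "DefaultEndpointsProtocol=https;AccountName=devstorage;AccountKey=<key>"
    }
  ]
}
```

At publish time, the selected configuration file's JSONPath entries are applied over the object definitions, so the same JSON source deploys to Dev, Test, or Prod without edits.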

Good news! You can customize this test as per your requirements

  • Choose and add questions from Interview Mocha question libraries

  • Add your own set of questions

  • Customize test settings like duration, number of questions, passing score, web proctoring and much more

  • Mail us at to create a custom test

Looking for a tailor-made test, to suit your assessment needs?

Get in touch

Global companies using Mocha assessments

  • altran
  • credit suisse
  • sephora
  • nielsen
  • capgemini
  • akamai

How Mocha assessments are brewed

Speed up your IT recruitment with Interview Mocha

  • 500+ IT skill tests
  • 20+ coding languages
  • 50+ domain knowledge tests
  • 100+ digital 2.0 skill tests
  • Customer focus aptitude tests
  • Enterprise ready features & more...

Why hiring managers and recruiters across the globe love Interview Mocha

Take your first step to hire job fit candidates