#75: Hacking your way to Azure Databricks notebook — advice from a lazy developer

Hang Nguyen
5 min read · Aug 3

I have been working with Azure Databricks for a while now. Gotta say that I love working with this service, but I still cannot get used to the new user interface. Maybe it just takes a bit more time XD. Anyway, while working with Databricks notebooks, I realized that many people, including me a few months ago, pay little attention to user experience or productivity and focus only on the code. That is a big mistake for a new Databricks user who will keep using the service for a long time. Why? If you use Azure Databricks almost daily, this post will explain it.

Knowing tips and best practices for Databricks notebooks can greatly enhance your experience and productivity. Here’s how:

  1. Improved Efficiency: Learning keyboard shortcuts and using Markdown effectively allows users to perform actions faster and navigate the notebook more efficiently. This saves time and makes the overall experience smoother.
  2. Readability and Organization: Organizing code into logical sections, using Markdown for explanations, and adding headings improve the readability of notebooks. Well-structured notebooks are easier to understand and maintain, especially for collaboration.
  3. Documentation and Context: Markdown cells allow users to add context, explanations, and documentation to their code. This helps other users (or even the same user in the future) understand the purpose and methodology of the notebook; see the `%md` sketch after this list.
  4. Version Control and Collaboration: Using version control allows multiple users to work collaboratively on the same notebook without fear of overwriting each other’s work. It enables users to roll back to previous versions if needed. Even better, users can work on the same notebook across environments (dev, staging, production) without downloading it or keeping a private workspace copy, and still push the latest code as usual.
  5. Optimized Queries: Knowing how to optimize queries and use DataFrame transformations can significantly improve the performance of data processing and analysis, leading to faster execution and better resource utilization; see the PySpark sketch after this list.
  6. Scheduled Automation: Scheduling and automating notebook jobs ensures that repetitive tasks run at specific intervals without manual intervention. This increases productivity and consistency; a scheduling sketch follows below.
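
To make points 1–3 concrete: in a Databricks notebook, a cell that starts with the `%md` magic command renders as Markdown. A minimal sketch (the storage path and table name are made up for illustration):

```
%md
## 1. Ingest raw events
Reads raw JSON events from ADLS and writes them to the bronze layer.

- Source: `abfss://raw@mystorageaccount.dfs.core.windows.net/events/`
- Output table: `bronze.events`
```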
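
For point 5, here is a minimal PySpark sketch of the usual “filter and prune early” advice, assuming a hypothetical `sales.orders` table (`spark` and `display` are provided by the Databricks notebook runtime):

```python
from pyspark.sql import functions as F

orders = spark.table("sales.orders")  # hypothetical table for illustration

# Filter and prune columns as early as possible so Spark reads less data,
# and keep the logic as DataFrame transformations instead of collecting
# rows to the driver.
daily_revenue = (
    orders
    .filter(F.col("order_date") >= "2023-01-01")   # filter early
    .select("order_date", "amount")                # prune columns early
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Cache only if the result is reused by several downstream queries.
daily_revenue.cache()
display(daily_revenue)
```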
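
And for point 6, the easiest way to schedule a notebook is through the Jobs UI, but the same can be scripted. Here is a sketch using the databricks-sdk Python package, where the job name, notebook path, cluster ID, and cron expression are all placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

# Reads the workspace URL and token from the environment or a config profile.
w = WorkspaceClient()

created = w.jobs.create(
    name="nightly-refresh",  # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="refresh",
            notebook_task=jobs.NotebookTask(
                notebook_path="/Repos/me/my-project/nightly_refresh"  # placeholder
            ),
            existing_cluster_id="1234-567890-abcde123",  # placeholder cluster ID
        )
    ],
    # Quartz cron: run every day at 02:00 in the given time zone.
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 2 * * ?",
        timezone_id="Europe/Helsinki",
    ),
)
print(f"Created job {created.job_id}")
```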