Zerobus Ingest: From Lakehouse to Insights Now

Join us for a very special BrickTalks session, where we introduce Zerobus Ingest, part of Lakeflow Connect, designed to streamline your event data ingestion into the lakehouse. We will be joined by Databricks Product Manager Victoria Butka, who will ...

  • 586 Views
  • 9 replies
  • 7 kudos
Thursday
Self-Paced Learning Festival: 09 January - 30 January 2026

Grow Your Skills and Earn Rewards! Mark your calendar: January 09 – January 30, 2026. Join us for a three-week event dedicated to learning, upskilling, and advancing your career in data engineering, analytics, machine learning, and generative AI. ...

  • 187394 Views
  • 682 replies
  • 175 kudos
12-09-2025
🎤 Call for Presentations: Data + AI Summit 2026 is Open!

June 15–18, 2026 Are you building the future with data and AI? Then this is your moment. The Call for Proposals for Data + AI Summit 2026 is officially open, and we want to hear from builders, practitioners, and innovators across the data and AI com...

  • 11165 Views
  • 4 replies
  • 6 kudos
12-15-2025
Level Up with Databricks Specialist Sessions

How to Register & Prepare: If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and what free courses you can take to prepare effectively. How to Begin Your Learning Path: S...

  • 4277 Views
  • 2 replies
  • 9 kudos
10-02-2025
Solution Accelerator Series | LLMs for Customer Service and Support

The LLMs for Customer Service and Support Solution Accelerator helps you quickly build an intelligent, context-aware chatbot on Databricks — boosting agent productivity and elevating customer satisfaction with LLM-powered experiences. Why This Accele...

  • 182 Views
  • 0 replies
  • 1 kudos
Friday
🌟 Community Pulse: Your Weekly Roundup! January 12 – 18, 2026

Another week powered by sharp questions, thoughtful solutions, and meaningful knowledge sharing. From technical deep dives to insightful reads, here’s what shaped the community this week. This Week’s Contributors: These members helped move c...

  • 391 Views
  • 4 replies
  • 3 kudos
Tuesday

Community Activity

Abiola-David
by Databricks MVP
  • 7 Views
  • 0 replies
  • 0 kudos

New Google Drive Connector for Azure Databricks: Seamless Ingestion Just Got Real

One of the biggest challenges in modern data engineering is bridging the gap between business‑user data sources and enterprise‑grade analytics platforms. Google Drive has long been a favourite for teams collaborating on documents, spreadsheets, and a...

  • 7 Views
  • 0 replies
  • 0 kudos
Artur1
by New Contributor
  • 26 Views
  • 2 replies
  • 0 kudos

Clouds in Databricks

I'm trying to understand the differences between the clouds that offer Databricks, and I'm interested in GCP. However, I'm unsure if choosing GCP would allow me to use AI functionalities like Agent Bricks. Regardless of the cloud I choose, will I hav...

  • 26 Views
  • 2 replies
  • 0 kudos
Latest Reply
balajij8
New Contributor
  • 0 kudos

Feature availability can vary by cloud, and GCP often lags. You can use Azure/AWS, especially for AI features. I've used Azure Databricks for 7 years with almost no issues.

  • 0 kudos
1 More Replies
ajay_wavicle
by New Contributor II
  • 22 Views
  • 1 replies
  • 0 kudos

How to transfer files other than .dbc from one workspace to another workspace?

I am trying to migrate workspace directory files from all folders to another workspace. I tried using the Databricks Terraform provider to export, but it exports only .dbc notebooks. Am I missing something, or is there a better way to do this?

  • 22 Views
  • 1 replies
  • 0 kudos
Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Hi @ajay_wavicle, the Terraform provider’s exporter is great for generating Terraform config for workspace resources, but it isn’t meant to bulk-migrate the actual workspace directory contents. Use the Databricks CLI’s workspace export/import comman...
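A minimal sketch of the CLI approach the reply suggests, assuming the current Databricks CLI is installed and two authentication profiles are already configured; the profile names and paths below are placeholders, not values from the thread:

```shell
# Export the whole workspace directory tree (all file types, not just .dbc)
# from the source workspace to a local folder.
databricks workspace export-dir /Workspace/Shared ./workspace-backup --profile source-ws

# Import the exported tree into the target workspace, overwriting existing files.
databricks workspace import-dir ./workspace-backup /Workspace/Shared --overwrite --profile target-ws
```

For large workspaces it can be easier to run this per top-level folder so a single failure doesn't abort the whole migration.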

  • 0 kudos
fly_high_five
by New Contributor III
  • 308 Views
  • 4 replies
  • 5 kudos

Unable to retrieve catalog, schema, tables using JDBC endpoint of SQL Warehouse

Hi, I am connecting to a SQL Warehouse in UC using its JDBC endpoint via DBeaver. However, it doesn't list any catalogs, schemas, or tables. I checked the permissions of the SQL WH by logging into the ADB Workspace and queried the table (attached a dummy table exa...

  • 308 Views
  • 4 replies
  • 5 kudos
Latest Reply
fly_high_five
New Contributor III
  • 5 kudos

Hi @Commitchell, thanks for trying out the DBeaver connection at your end. Yes, the OAuth is successful, and I'm setting the same driver properties as you. Today, I got the issue resolved by updating the driver. At the time of posting my question, I was using ...

  • 5 kudos
3 More Replies
Fox19
by New Contributor III
  • 197 Views
  • 5 replies
  • 3 kudos

CSV Ingestion using Autoloader with single variant column

I've been working on ingesting CSV files with varying schemas using Auto Loader. The goal is to take the CSVs and ingest them into a bronze table that writes each record as a key-value mapping with only the relevant fields for that record. I also want to ...
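Setting Auto Loader itself aside, the key-value shaping the post describes can be illustrated in plain Python (a conceptual sketch, not the Databricks API; `csv.DictReader` naturally yields only the columns each file's header actually defines):

```python
import csv
import io

def rows_as_kv(csv_text: str) -> list[dict]:
    """Parse one CSV file and return each record as a key-value mapping
    containing only the fields relevant to that record."""
    reader = csv.DictReader(io.StringIO(csv_text))
    # Drop keys whose value is empty so records carry no null padding.
    return [{k: v for k, v in row.items() if v not in (None, "")} for row in reader]

file_a = "id,name\n1,alice\n"
file_b = "id,email,age\n2,bob@example.com,\n"

print(rows_as_kv(file_a))  # [{'id': '1', 'name': 'alice'}]
print(rows_as_kv(file_b))  # [{'id': '2', 'email': 'bob@example.com'}]
```

Because each file is parsed against its own header, two files with different schemas produce records with different key sets, which is the behavior the bronze-table design above is after.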

  • 197 Views
  • 5 replies
  • 3 kudos
Latest Reply
pradeep_singh
New Contributor II
  • 3 kudos

If I understand the problem correctly, you are getting extra keys for records from files where the keys actually don't exist. I was not able to reproduce this issue. I am getting different key-value pairs and no extra keys with null. Can you share ...

  • 3 kudos
4 More Replies
echol
by New Contributor
  • 214 Views
  • 5 replies
  • 1 kudos

Redeploy Databricks Asset Bundle created by others

Hi everyone, our team is using Databricks Asset Bundles (DAB) with a customized template to develop data pipelines. We have a core team that maintains the shared infrastructure and templates, and multiple product teams that use this template to develo...

  • 214 Views
  • 5 replies
  • 1 kudos
Latest Reply
pradeep_singh
New Contributor II
  • 1 kudos

Development-mode deployment gives you your own copy of the workflow. You don't need a separate configuration for each developer.

  • 1 kudos
4 More Replies
NotCuriosAtAll
by New Contributor II
  • 20 Views
  • 0 replies
  • 0 kudos

Non deterministic behavior from the cluster

I asked this question a while ago, where I explained the cluster my team uses on Databricks. To save you some time: we use an all-purpose Standard D2ads v6 cluster with 8 GB of RAM and 2 cores. We are facing an issue with the memory, which is pin...

  • 20 Views
  • 0 replies
  • 0 kudos
prajwalpoojary
by New Contributor
  • 24 Views
  • 1 replies
  • 0 kudos

Databricks serverless run init scripts

Hello, I want to read from Snowflake, which is in a private network, using Databricks serverless compute. We use an init script to whitelist the IP address inside the job/interactive cluster. Now I am not able to execute the init script on serverless. How to ...

  • 24 Views
  • 1 replies
  • 0 kudos
Latest Reply
juan_maedo
New Contributor III
  • 0 kudos

Hi @prajwalpoojary, more than running init scripts, you might want to know and check all of this: Databricks serverless compute uses dynamically assigned IP ranges each time a resource is attached. These ranges vary depending on the region and cloud provider you ...

  • 0 kudos
Srikanthdata_01
by New Contributor
  • 58 Views
  • 1 replies
  • 0 kudos

Associate Data Engineering Pathway – Completed All 4 Modules but Only 2 Badges Visible

Hello Team, I have completed all 4 modules required for the Associate Data Engineering Learning Pathway as per the qualification criteria: Data Ingestion with Lakeflow Connect, Deploy Workloads with Lakeflow Jobs, Build Data Pipelines with Lakeflow Spark D...

  • 58 Views
  • 1 replies
  • 0 kudos
Latest Reply
cert-ops
Databricks Employee
  • 0 kudos

Hello @Srikanthdata_01, please file a ticket with our support team so they can investigate the issue for you. Please note that we cannot provide support via the community. Thanks & Regards, @cert-ops

  • 0 kudos
Krisna_91
by New Contributor
  • 19 Views
  • 1 replies
  • 0 kudos

50% certification voucher

I have completed the PROFESSIONAL DATA ENGINEERING pathway during the January event (1st to 31st). When can I expect the 50% voucher?

  • 19 Views
  • 1 replies
  • 0 kudos
Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Krisna_91! To be eligible for the voucher, please ensure that all modules listed under the LEARNING PATHWAY 2: PROFESSIONAL DATA ENGINEERING are completed within Customer Academy during the event window (January 9–30). Vouchers will be sent to...

  • 0 kudos
batch_bender
by New Contributor
  • 162 Views
  • 2 replies
  • 1 kudos

create_auto_cdc_from_snapshot_flow vs create_auto_cdc_flow – when is snapshot CDC actually worth it?

I am deciding between create_auto_cdc_from_snapshot_flow() and create_auto_cdc_flow() in a pipeline. My source is a daily full snapshot table: no operation column (no insert/update/delete flags); order can be derived from snapshot_date (sequence by); rows ...

  • 162 Views
  • 2 replies
  • 1 kudos
Latest Reply
aleksandra_ch
Databricks Employee
  • 1 kudos

Hi @batch_bender, for your case I recommend using create_auto_cdc_from_snapshot_flow(). Since your system provides full snapshots without row-level operation data, this is the only way to accurately generate SCD tables. How it works: it compares th...
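The snapshot comparison the reply describes can be sketched in plain Python (a conceptual illustration of the diffing idea, not the Lakeflow implementation; the keys and records below are hypothetical):

```python
def diff_snapshots(prev: dict, curr: dict) -> list[tuple[str, str]]:
    """Derive CDC events by comparing two full snapshots keyed by primary key.
    Returns (operation, key) pairs: insert, update, or delete."""
    events = []
    for key in curr:
        if key not in prev:
            events.append(("insert", key))      # new key appeared
        elif prev[key] != curr[key]:
            events.append(("update", key))      # key present, payload changed
    for key in prev:
        if key not in curr:
            events.append(("delete", key))      # key vanished from the snapshot
    return events

day1 = {"a": {"qty": 1}, "b": {"qty": 2}}
day2 = {"a": {"qty": 5}, "c": {"qty": 3}}
print(diff_snapshots(day1, day2))  # [('update', 'a'), ('insert', 'c'), ('delete', 'b')]
```

This is why an operation column isn't needed for snapshot-based CDC: the inserts, updates, and deletes are recovered entirely from consecutive snapshots, with snapshot_date providing the sequencing.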

  • 1 kudos
1 More Replies
MandyR
by Community Manager
  • 133 Views
  • 5 replies
  • 29 kudos

Today is Community Manager Appreciation day! Big shout out to the Databricks community team!

Today is Community Manager Appreciation Day, celebrated every year on the fourth Monday of January, a tradition started in 2010. And I want to use it to shine a big, sparkly spotlight on my fellow Databricks Community Managers who make the whole ...

  • 133 Views
  • 5 replies
  • 29 kudos
Latest Reply
Louis_Frolio
Databricks Employee
  • 29 kudos

Well done, team! You all work incredibly hard, and your dedication truly shows. Keep up the great work — it’s inspiring to see. Cheers, Louis

  • 29 kudos
4 More Replies
fgeriksen
by New Contributor
  • 97 Views
  • 3 replies
  • 1 kudos

Enabling External Lineage on a free or trial account?

Hi, as part of a small OSS project I am doing, dbt-unity-lineage, I need to enable Bring Your Own Data Lineage (Public Preview as of December 2025). But it seems you can't enable that preview in either the Free Edition or a Trial? I'd rather not use my emplo...

Administration & Architecture
dbt
dbt, GA, Public Preview
  • 97 Views
  • 3 replies
  • 1 kudos
Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

@fgeriksen , if you are satisfied with the response please "Accept as Solution" so that others will be informed as well.  Cheers, Louis.

  • 1 kudos
2 More Replies
AJ270990
by Contributor II
  • 171 Views
  • 2 replies
  • 1 kudos

Resolved! All purpose cluster, SQL Warehouse and Job Cluster are not executing the code

The all-purpose cluster, SQL Warehouse, and job cluster are not executing the Spark code in Pro and Classic mode. When switched to Serverless mode, they are able to execute the code. When checked with the networking team, there were no subnet changes recently. ...

  • 171 Views
  • 2 replies
  • 1 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 1 kudos

Can you share the actual commands and error messages? Screenshots if you have them.

  • 1 kudos
1 More Replies
gopínath
by Databricks Employee
  • 4934 Views
  • 1 replies
  • 8 kudos

Step-by-Step Guide to Building Custom MCP Server on Databricks

Introduction: One of the biggest challenges with LLMs is bridging the gap between static knowledge and real-world actions. MCP solves this by providing a standard way for models to connect with external tools and data sources. The Model Context Pro...

  • 4934 Views
  • 1 replies
  • 8 kudos
Latest Reply
vincentsquinter
  • 8 kudos

I really enjoyed this MCP server post. The step-by-step instructions are excellent. For those looking to improve their MCP server development skills, this resource is very useful: https://mobisoftinfotech.com/services/mcp-ser...

  • 8 kudos