Databricks takes human intervention out of Spark processing

A new workflow feature for Databricks Cloud can automate routine deployment tasks

Databricks now offers a way to schedule Spark jobs in the cloud

Databricks wants to make it possible to take humans out of the loop entirely when it comes to running complicated data analysis jobs.

The company, which offers a commercial version of Spark, has introduced a tool that automates setting up and executing analysis jobs written for the open source data processing platform.

"You can express very complicated workflows using this thing," said Ali Ghodsi, Databricks' director of engineering. "There is no human in the loop any more."

Founded by several of the original developers of Spark, Databricks offers a commercial version of the platform designed to run on Amazon Web Services and eliminate many of the mundane chores of setting up and maintaining an in-house deployment.

Spark can analyze very large data sets across multiple servers for tasks such as generating user recommendations for an Internet service or predicting a company's future revenue.

As customers get more comfortable with big data, they are increasingly scheduling their analysis jobs to run on a regular basis, which until now has required an administrator to log into a console and coordinate all the steps needed to run each job.

The new feature for Databricks Cloud, called Jobs, gives administrators a way to schedule standalone Spark jobs to run at specified intervals. A user can, for example, point a Spark application at a specific Databricks Cloud cluster and have it run at a set time. Users can choose a dedicated cluster for maximum performance or a cluster shared with other users to save money.

The service notifies the user when a task completes, keeps a log recording whether the task succeeded or failed, and can alert the administrator if something goes awry.
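To make that concrete, a scheduled job definition of the kind described above might capture fields roughly like the following. This is a hypothetical Python sketch; the field names, path, and e-mail address are assumptions for illustration, not Databricks Cloud's actual interface.

    # Hypothetical sketch only: field names below are assumptions used to
    # illustrate the article's description, not Databricks Cloud's actual API.
    nightly_job = {
        "name": "nightly-recommendations",
        "notebook": "/Users/analyst/recommendations",  # illustrative notebook path
        "schedule": "0 2 * * *",                       # run at 02:00 every day
        "cluster": {"dedicated": True, "workers": 8},  # dedicated for speed, shared to save money
        "notify": ["ops@example.com"],                 # who hears about success or failure
    }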

In effect, the feature establishes a way to create a production pipeline: a series of jobs that execute automatically and in coordination with one another. An administrator can set up a workflow that runs two Spark jobs at the same time and waits for both to finish. When both are complete, the workflow can start another job that uses the results from the first two. If either of the initial jobs fails, the entire workflow can be terminated.
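That dependency pattern, two jobs running in parallel, a third consuming both results, and the whole workflow stopping if either upstream job fails, can be sketched with plain Python. The job functions here are illustrative placeholders, not Databricks code.

    # Illustration of the fan-in pattern described above, using only the Python
    # standard library; the job functions are hypothetical placeholders.
    from concurrent.futures import ThreadPoolExecutor

    def extract_clickstream():      # first upstream job (placeholder)
        return "clicks-dataset"

    def extract_purchases():        # second upstream job (placeholder)
        return "purchases-dataset"

    def join_and_report(clicks, purchases):   # downstream job using both results
        print("joining", clicks, "with", purchases)

    with ThreadPoolExecutor(max_workers=2) as pool:
        f1 = pool.submit(extract_clickstream)
        f2 = pool.submit(extract_purchases)
        try:
            results = (f1.result(), f2.result())   # wait for both to finish
        except Exception as err:
            raise SystemExit("an upstream job failed; terminating the workflow: %s" % err)

    join_and_report(*results)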

Jobs are written in Spark notebooks. Similar to IPython notebooks for Python, Spark notebooks are user-generated packages that contain all the components needed to run an interactive data analysis job across a cluster. They can be written in Python, Scala, SQL, or a combination of the three.
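For a sense of what such a notebook cell might contain, here is a minimal PySpark sketch of the kind of analysis that could be scheduled this way. The input path and CSV column layout are assumptions for illustration.

    # Minimal PySpark sketch of a notebook-style analysis cell; the S3 path and
    # column positions are illustrative assumptions.
    from pyspark import SparkContext

    sc = SparkContext(appName="nightly-revenue-rollup")
    rows = sc.textFile("s3://example-bucket/sales/*.csv")          # hypothetical input
    totals = (rows.map(lambda line: line.split(","))
                  .map(lambda cols: (cols[0], float(cols[2])))     # (region, amount)
                  .reduceByKey(lambda a, b: a + b))
    print(totals.collect())                                        # per-region revenue totals
    sc.stop()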

Pricing for Databricks Cloud is tiered based on usage capacity, support model, and feature set, starting at several hundred dollars per month.

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com
