Google program can automatically caption photos

A neural network-based system scores above average in writing natural captions

“Two pizzas sitting on top of a stove top oven,” is how a Google program described this image.

Next time you're stumped when trying to write a photo caption, try Google.

The search giant has developed a machine-learning system that can automatically and accurately write captions for photos, according to a Google Research Blog post.

The innovation could make it easier to search for images on Google, help visually impaired people understand image content and provide alternative text for images when Internet connections are slow.

In a paper posted on arXiv, Google researchers Oriol Vinyals, Alexander Toshev, Samy Bengio and Dumitru Erhan described how they developed a captioning system called Neural Image Caption (NIC).

NIC is based on techniques from the field of computer vision, which allows machines to see the world, and natural language processing, which tries to make human language meaningful to computers.

The researchers used two different kinds of artificial neural networks, which are biologically inspired computer models. One of the networks encoded the image into a compact representation, while the other network generated a sentence to describe it.
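
This encoder-decoder pairing can be sketched in a few lines. The toy below stands in for the real thing: a single linear layer plays the image-encoding network, a single recurrent step per word plays the sentence generator, and the weights are random, so the "caption" it emits is meaningless. It only illustrates the data flow the researchers describe, not Google's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<start>", "<end>", "a", "dog", "on", "grass"]
EMBED_DIM, HIDDEN_DIM = 8, 8

# Toy "encoder": one linear layer standing in for a deep
# convolutional network that compresses an image into a vector.
W_enc = rng.normal(size=(EMBED_DIM, 16))

def encode(image_features):
    return np.tanh(W_enc @ image_features)

# Toy "decoder": one recurrent step per word, standing in for the
# sentence-generating network.
W_h = rng.normal(size=(HIDDEN_DIM, HIDDEN_DIM))
W_x = rng.normal(size=(HIDDEN_DIM, len(VOCAB)))
W_out = rng.normal(size=(len(VOCAB), HIDDEN_DIM))

def decode(image_vec, max_len=6):
    h = image_vec                         # hidden state seeded by the image
    word = VOCAB.index("<start>")
    caption = []
    for _ in range(max_len):
        x = np.eye(len(VOCAB))[word]      # one-hot of the previous word
        h = np.tanh(W_h @ h + W_x @ x)
        word = int(np.argmax(W_out @ h))  # greedy choice of next word
        if VOCAB[word] == "<end>":
            break
        caption.append(VOCAB[word])
    return caption

caption = decode(encode(rng.normal(size=16)))
print(caption)
```

In the real system, both components are trained jointly so that the decoder learns to emit fluent sentences conditioned on the image representation.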

The researchers' goal was to train the system to produce natural-sounding captions based on the objects it recognizes in the images.

NIC produced accurate results, such as "A group of people shopping at an outdoor market" for a photo of a market. But it also turned out captions with minor mistakes, counting only two dogs in an image of three, and occasional major errors, such as describing a picture of a roadside sign as a refrigerator.

Still, the NIC model scored 59 on the Pascal dataset, where the previous state of the art was 25 and higher scores are better, according to the researchers, who added that humans score around 69. The performance was evaluated with BLEU, a standard metric that scores machine-generated text against text written by a human.
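
Scores in this family are typically computed from word-overlap between a machine caption and a human reference. As an illustration only, and not the researchers' actual evaluation code, the clipped unigram precision that forms the building block of the BLEU metric can be computed like this:

```python
from collections import Counter

def unigram_precision(candidate, reference):
    """Fraction of candidate words that also appear in the reference,
    with each reference word usable at most as often as it occurs
    (the 'clipped' count used by BLEU-1)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    matched = sum(min(n, ref[w]) for w, n in cand.items())
    return matched / sum(cand.values())

machine = "a group of people shopping at an outdoor market"
human = "people shopping in an outdoor market"  # hypothetical reference
score = round(100 * unigram_precision(machine, human), 1)
print(score)  # 55.6
```

The full metric also counts longer word sequences and penalizes overly short captions, but the core idea is the same: more overlap with human-written references yields a higher score.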

"It is clear from these experiments that, as the size of the available datasets for image description increases, so will the performance of approaches like NIC," the researchers wrote.


Tim Hornyak

IDG News Service