Scientists have been developing probabilistic programming languages that can, in many cases, do in 50 lines of code what would otherwise take thousands.

Recent advances, such as apps that convert speech to text, rely on "machine learning," in which computers learn from large training datasets, MIT reported. Probabilistic programming languages allow researchers to "mix and match" machine-learning techniques that have worked well in the past.

A team of MIT researchers will demonstrate that standard computer-vision tasks can be written as short programs in a probabilistic programming language.

"This is the first time that we're introducing probabilistic programming in the vision area," said Tejas Kulkarni, an MIT graduate student in brain and cognitive sciences and first author on the new paper. "The whole hope is to write very flexible models, both generative and discriminative models, as short probabilistic code, and then not do anything else. General-purpose inference schemes solve the problems."

One of the tasks the researchers are investigating is how to construct a 3-D model of a human face from 2-D images. The program describes the face as two symmetrically placed objects (the eyes) plus two features along the center line (the nose and mouth). Translating that 2-D description into the "syntax" of the probabilistic programming language requires minimal work, and once that is done, the model is complete.
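
Picture's actual code is not reproduced here, but the underlying idea of "write a short generative model and let general-purpose inference do the rest" can be sketched in ordinary Python. In the toy example below, a few lines declare latent variables for the eyes, nose, and mouth, a crude renderer turns them into a 2-D image, and a generic sampling loop adjusts the variables until the rendering matches an observed image. Every function, variable name, and formula is an illustrative assumption, not Picture's real syntax.

    # Illustrative sketch only -- not Picture's real syntax or renderer.
    # A toy "face" model: latent variables place two symmetric eyes, a nose,
    # and a mouth; a generic Metropolis-Hastings loop infers them from an image.
    import math
    import random

    def sample_face():
        # Latent description of a face: eye spacing plus feature heights.
        return {
            "eye_gap": random.uniform(0.2, 0.8),   # horizontal distance between eyes
            "eye_y":   random.uniform(0.2, 0.5),   # vertical position of both eyes
            "nose_y":  random.uniform(0.4, 0.7),
            "mouth_y": random.uniform(0.6, 0.9),
        }

    def render(face, size=16):
        # Crude renderer: mark the four facial features on a size x size grid.
        img = [[0.0] * size for _ in range(size)]
        def put(x, y):
            xi = int(max(0.0, min(0.999, x)) * size)
            yi = int(max(0.0, min(0.999, y)) * size)
            img[yi][xi] = 1.0
        put(0.5 - face["eye_gap"] / 2, face["eye_y"])   # left eye
        put(0.5 + face["eye_gap"] / 2, face["eye_y"])   # right eye, mirrored
        put(0.5, face["nose_y"])                        # nose on the midline
        put(0.5, face["mouth_y"])                       # mouth on the midline
        return img

    def log_likelihood(observed, rendered, noise=0.1):
        # Pixel-wise Gaussian comparison of observed and rendered images.
        return -sum((o - r) ** 2
                    for row_o, row_r in zip(observed, rendered)
                    for o, r in zip(row_o, row_r)) / (2 * noise ** 2)

    def infer(observed, steps=5000):
        # Generic inference: propose small changes to the latent variables
        # and keep them when they make the rendering match the image better.
        face = sample_face()
        score = log_likelihood(observed, render(face))
        for _ in range(steps):
            key = random.choice(list(face))
            proposal = dict(face, **{key: face[key] + random.gauss(0, 0.05)})
            new_score = log_likelihood(observed, render(proposal))
            if new_score >= score or random.random() < math.exp(new_score - score):
                face, score = proposal, new_score
        return face

    if __name__ == "__main__":
        true_face = sample_face()
        recovered = infer(render(true_face))
        print("true:     ", {k: round(v, 2) for k, v in true_face.items()})
        print("recovered:", {k: round(v, 2) for k, v in recovered.items()})

The point of the sketch is the division of labor: the model itself is only the handful of lines in sample_face and render, while the generic inference loop, not the programmer, does the work of recovering the face parameters from the image.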

"When you think about probabilistic programs, you think very intuitively when you're modeling," Kulkarni said. "You don't think mathematically. It's a very different style of modeling."

A probabilistic programming language relies on an inference algorithm, which continuously adjusts probabilities based on newly introduced training data. The newly developed language, dubbed Picture, contains several inference algorithms that have worked well in past computer-vision tasks; the system can try each one on a given program and determine which works best.
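
The article does not describe how Picture chooses among its inference algorithms, but the basic idea of "try each strategy and keep the best" can be sketched in a few lines of Python. The two strategies and the toy problem below are hypothetical stand-ins, not Picture's actual algorithms.

    # Hypothetical sketch of trying several inference strategies on one
    # problem and keeping the winner -- not Picture's actual mechanism.
    # Each strategy returns (solution, score); higher scores are better.
    import random

    def random_search(problem, iters=200):
        # Strategy 1: sample fresh guesses and remember the best one.
        best, best_score = None, float("-inf")
        for _ in range(iters):
            guess = problem["propose"]()
            score = problem["score"](guess)
            if score > best_score:
                best, best_score = guess, score
        return best, best_score

    def hill_climb(problem, iters=200):
        # Strategy 2: repeatedly perturb the current guess, keeping improvements.
        current = problem["propose"]()
        current_score = problem["score"](current)
        for _ in range(iters):
            neighbor = problem["perturb"](current)
            score = problem["score"](neighbor)
            if score > current_score:
                current, current_score = neighbor, score
        return current, current_score

    def best_strategy(problem, strategies):
        # Run every candidate inference strategy on the same problem and
        # report whichever one achieves the highest score.
        results = {name: fn(problem) for name, fn in strategies.items()}
        winner = max(results, key=lambda name: results[name][1])
        return winner, results[winner]

    if __name__ == "__main__":
        # Toy problem: recover the hidden value 0.3 from a quadratic score.
        target = 0.3
        problem = {
            "propose": lambda: random.uniform(0, 1),
            "perturb": lambda x: x + random.gauss(0, 0.05),
            "score":   lambda x: -(x - target) ** 2,
        }
        winner, (solution, score) = best_strategy(
            problem, {"random_search": random_search, "hill_climb": hill_climb})
        print(winner, round(solution, 3), round(score, 6))
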

"Using learning to improve inference will be task-specific, but probabilistic programming may alleviate re-writing code across different problems," Kulkarni said. "The code can be generic if the learning machinery is powerful enough to learn different strategies for different tasks."