

Researchers at Google Brain, one of Google’s AI research divisions, developed an automated tool for programming in machine learning frameworks like TensorFlow. They say it achieves better-than-human performance on some challenging development tasks, taking seconds to solve problems that take human programmers minutes to hours.

Emerging AI techniques have resulted in breakthroughs across computer vision, audio processing, natural language processing, and robotics. Machine learning frameworks like TensorFlow, Facebook's PyTorch, and MXNet play an important role, enabling researchers to develop and refine new models. But while these frameworks have simplified iterating on and training AI models, they have a steep learning curve, because the paradigm of computing over tensors differs sharply from traditional programming. (Tensors are algebraic objects that generalize vectors and matrices, and they're a convenient data format in machine learning.) Most models require various tensor manipulations for data processing or cleaning, custom loss functions, and accuracy metrics, all of which must be implemented within the constraints of a framework.
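To see how the tensor paradigm differs from conventional loop-based code, consider a minimal TensorFlow sketch (ours, not from the paper): normalizing each row of a matrix is a single broadcasted expression rather than a pair of nested loops.

```python
import tensorflow as tf

# Row-normalize a matrix: one broadcasted expression instead of nested loops.
x = tf.constant([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])
row_sums = tf.reduce_sum(x, axis=1, keepdims=True)  # shape (2, 1)
normalized = x / row_sums                           # broadcasts to shape (2, 3)
print(normalized)  # [[1/6, 2/6, 3/6], [4/15, 5/15, 6/15]]
```

Writing such expressions requires knowing which operations exist and how their shapes and broadcasting rules interact, which is exactly the knowledge newcomers to these frameworks lack.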

The researchers’ TF-Coder tool aims to synthesize tensor manipulation programs from input-output examples and natural language descriptions. Per-operation weights let TF-Coder enumerate TensorFlow expressions in order of increasing complexity, while a novel type- and value-based filtering system handles the constraints imposed by the TensorFlow library. A separate framework combines predictions from multiple independent machine learning models, conditioned on features of the input and output tensors and on the natural language description of a task, to choose which operations to prioritize during the search. This tailors the search to the particular synthesis task at hand.
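The sketch below is illustrative, not TF-Coder's actual interface: it shows the kind of input-output specification such a synthesizer consumes (the task itself is a hypothetical one, in the spirit of the StackOverflow problems the paper targets) and one candidate expression a weighted enumerative search could reach and verify against the examples.

```python
import tensorflow as tf

# Task specification: example input tensors and the desired output tensor.
rows = tf.constant([10, 20, 30])
cols = tf.constant([1, 2, 3, 4])
desired = tf.constant([[11, 12, 13, 14],
                       [21, 22, 23, 24],
                       [31, 32, 33, 34]])

# One expression an enumerative search over TensorFlow operations could
# discover: reshape `rows` into a column and rely on broadcasting.
candidate = tf.add(tf.expand_dims(rows, 1), cols)

# A candidate is accepted only if it reproduces the output exactly.
assert bool(tf.reduce_all(tf.equal(candidate, desired)))
print(candidate)
```

Because every candidate is executed on the example inputs and checked against the example output, the search can discard incorrect expressions immediately, and type- and value-based filtering prunes expressions whose shapes or dtypes could never match.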


TF-Coder considers 134 of the 500 tensor-manipulation operations in TensorFlow, including reshapes, filters, aggregations, maps, indexing, slicing, grouping, sorting, and mathematical operations. It can handle problems involving compositions of four or five different operations and data structures with 10 or more components, which leave little room for error, as shapes and data types must remain compatible throughout.
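For a sense of what such a composition looks like, here is an illustrative TensorFlow expression (our example, not one of the benchmark tasks) that chains four operations to turn per-row scores into a one-hot mask of each row's maximum; every intermediate shape and dtype must line up for the whole expression to run.

```python
import tensorflow as tf

# Illustrative composition of four operations (argmax, shape, one_hot, cast).
scores = tf.constant([[0.2, 0.7, 0.1],
                      [0.9, 0.05, 0.05]])

mask = tf.cast(                            # 4. cast back to the scores' dtype
    tf.one_hot(                            # 3. indices -> one-hot rows
        tf.argmax(scores, axis=1),         # 1. index of each row's maximum
        depth=tf.shape(scores)[1]),        # 2. number of columns
    scores.dtype)

print(mask)  # [[0. 1. 0.], [1. 0. 0.]]
```

A single mismatched axis or dtype anywhere in the chain makes the expression fail or produce the wrong result, which is why the search's type- and value-based filtering matters.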

The coauthors say that in experiments, TF-Coder achieved "superhuman" performance on a range of real problems from the question-and-answer site StackOverflow. Evaluated on 70 real-world tensor transformation tasks drawn from StackOverflow and from a production environment, TF-Coder synthesized solutions to 63 of them in 17 seconds on average, and its learned models made synthesis "significantly" faster (35.4% on average) than searching without them. TF-Coder also produced solutions the coauthors describe as "simpler" and "more elegant" than those written by TensorFlow experts; in two cases its solutions required fewer operations than the best handwritten ones.

“We believe that TF-Coder can help both machine learning beginners and experienced practitioners in writing tricky tensor transformation programs that are common in deep learning pipelines,” the coauthors wrote in a preprint paper describing TF-Coder. “Perhaps the most important lesson to be learned from this work is simply the fact that a well-optimized enumerative search can successfully solve real-world tensor manipulation problems within seconds, even on problems that human programmers struggle to solve within minutes.”

