In what can only be perceived as a win for Facebook, OpenAI today announced that it will migrate to the social network’s PyTorch machine learning framework for future projects, eschewing Google’s long-in-the-tooth TensorFlow platform. OpenAI is the San Francisco-based AI research firm cofounded by CTO Greg Brockman, chief scientist Ilya Sutskever, Elon Musk, and others, with backing from luminaries like LinkedIn cofounder Reid Hoffman and former Y Combinator president Sam Altman. In a blog post, the company cited PyTorch’s efficiency, scalability, and adoption as the reasons for its decision.

“Going forward we’ll primarily use PyTorch as our deep learning framework but sometimes use other ones when there’s a specific technical reason to do so,” said the company in a statement. “We’re … excited to be joining a rapidly-growing developer community, including organizations like Facebook and Microsoft, in pushing scale and performance on [graphics cards].”

OpenAI says that many of its teams have already migrated their work to PyTorch and that they’ll contribute to the PyTorch community in the coming months. The company also plans to release a PyTorch version of its Spinning Up in Deep RL educational resource in early 2020, after which it intends to investigate scaling AI systems with data-parallel training, visualizing those systems with model interpretability tools, and building general-purpose robotics frameworks. (OpenAI is in the process of writing PyTorch bindings for its highly optimized blocksparse kernels, and it says it’ll open-source those bindings in the coming months.)
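For readers unfamiliar with the approach, the snippet below is a minimal sketch of what data-parallel training looks like in PyTorch; the model, batch sizes, and hyperparameters are placeholders chosen for illustration and are not drawn from OpenAI’s code.

```python
# A minimal sketch of single-machine data-parallel training in PyTorch.
# The model and data here are synthetic placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Replicate the model across all visible GPUs; fall back to CPU if none.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a synthetic batch; each GPU processes a slice of it.
inputs = torch.randn(256, 128, device=device)
targets = torch.randint(0, 10, (256,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
```

For multi-node jobs, PyTorch’s DistributedDataParallel wrapper is the usual choice; the single-process DataParallel above simply keeps the sketch short.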

PyTorch, which Facebook publicly released in October 2016, is an open source machine learning library based on Torch, a scientific computing framework and scripting language that’s in turn based on the Lua programming language. As of March 2018, it incorporates Caffe2, a deep learning toolset pioneered by University of California, Berkeley researchers and further developed by Facebook’s AI Research lab.
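By way of illustration (not from the announcement), the snippet below shows the define-by-run style PyTorch is known for: tensors are manipulated with ordinary Python code, and autograd records the operations so gradients can be computed afterward.

```python
# Minimal illustration of PyTorch's define-by-run (eager) style: there is no
# separate graph-construction step; autograd tracks the Python ops directly.
import torch

x = torch.randn(3, requires_grad=True)  # leaf tensor whose gradient we want
y = (x ** 2).sum()                      # built imperatively, like plain NumPy
y.backward()                            # backpropagate through the recorded ops
print(x.grad)                           # gradient of y w.r.t. x, i.e. 2 * x
```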

While TensorFlow has been around slightly longer (since November 2015), PyTorch continues to see rapid uptake in the data science and developer community. It claimed one of the top spots for fastest-growing open source projects in the past 12 months, according to GitHub’s 2018 Octoverse report. Facebook recently revealed that in 2019 the number of contributors to the platform grew more than 50% year-over-year to nearly 1,200. An analysis conducted by The Gradient found that every major AI conference in 2019 had a majority of papers implemented in PyTorch, and O’Reilly noted that PyTorch citations in papers grew by more than 194% in the first half of 2019 alone.

Unsurprisingly, a number of leading machine learning software projects are built on top of PyTorch, including Uber’s Pyro and Hugging Face’s Transformers. Preferred Networks, developer of the Chainer framework, recently added to the momentum with a pledge to move from its bespoke framework to PyTorch in the near future.
