Framework consolidation: Is it time for deep learning frameworks to merge?

Image credit: iStock

Big tech firms have touted their proprietary artificial intelligence frameworks at the cost of better collaboration.
    • Author: Quantumrun Foresight
    • January 31, 2023

    Tools that utilize artificial intelligence (AI) and machine learning (ML) make it possible for organizations to manage and analyze their ever-growing treasure troves of data more effectively. In particular, deep learning (DL) frameworks are becoming the building blocks of many AI/ML innovations. The challenge now lies in consolidating different frameworks to fast-track research and development.

    Framework consolidation context

    A programming framework is a set of tools that helps developers build well-organized, reliable software and systems. It provides ready-made components and solutions to common problems, which developers can customize to their specific needs. In traditional programming, custom code calls into a library to access reusable code; with inversion of control (IoC), it is the framework that calls the custom code when necessary.
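    As a rough illustration of that inversion of control, here is a minimal Python sketch; the TrainingFramework class and on_step callback are hypothetical names invented for this example, not part of any real library:

        # Traditional style: application code calls into a library function.
        def library_square(x):
            return x * x

        print(library_square(3))  # the application drives control flow

        # IoC style: the framework owns the loop and calls the custom code
        # supplied by the developer when it decides the time is right.
        class TrainingFramework:
            def __init__(self, on_step):
                self.on_step = on_step  # custom code plugged into the framework

            def run(self, steps):
                for step in range(steps):
                    self.on_step(step)  # framework invokes the custom code

        TrainingFramework(on_step=lambda s: print(f"step {s}")).run(3)

    This is why DL frameworks feel different from ordinary libraries: the training loop, scheduling, and resource management live in the framework, while the developer supplies the model and the logic to run at each step.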

    When it comes to DL, frameworks provide an easy way to design, train, and validate deep neural networks. Many DL frameworks use graphics processing units (GPUs) to accelerate training, including PyTorch, TensorFlow, PyTorch Geometric, and DGL. These frameworks rely on GPU-accelerated libraries like cuDNN, NCCL, and DALI to deliver high performance. 
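    A minimal PyTorch sketch of that design/train/validate cycle follows; the toy model and synthetic data are illustrative assumptions rather than anything from the article, and the GPU (backed by libraries such as cuDNN) is used only if one is available:

        import torch
        import torch.nn as nn

        device = "cuda" if torch.cuda.is_available() else "cpu"

        # Design: a small feed-forward network for a toy regression task.
        model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        # Synthetic data stands in for a real dataset.
        x_train, y_train = torch.randn(256, 10), torch.randn(256, 1)
        x_val, y_val = torch.randn(64, 10), torch.randn(64, 1)

        # Train: the framework handles autograd and GPU kernels under the hood.
        for epoch in range(5):
            model.train()
            optimizer.zero_grad()
            loss = loss_fn(model(x_train.to(device)), y_train.to(device))
            loss.backward()
            optimizer.step()

        # Validate: evaluate on held-out data with gradients disabled.
        model.eval()
        with torch.no_grad():
            val_loss = loss_fn(model(x_val.to(device)), y_val.to(device))
        print(f"validation loss: {val_loss.item():.4f}")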

    The popularity of certain DL frameworks among researchers often mirrors trends in commercial applications. For example, Google's TensorFlow and Meta's PyTorch are two of the most popular, and PyTorch in particular has seen an uptick in adoption since 2017. According to the AI-focused magazine The Gradient, 75 percent of 2019 conference papers that mentioned a framework cited PyTorch but not TensorFlow. Of the 161 researchers who had published more TensorFlow papers than PyTorch papers, 55 percent switched to PyTorch, while only 15 percent moved in the opposite direction.

    Disruptive impact

    There is an increasing need for companies to consolidate their AI frameworks to deliver consistent results and quality control. The research-to-production pipeline of AI projects has historically been slow and tedious: multiple steps, hard-to-use tools, and a lack of standardization made it difficult to keep track of everything. Researchers and engineers were forced to choose between frameworks suited to research and those suited to commercial production, but rarely both.

    In 2021, Meta decided to migrate all its AI systems to PyTorch. Previously, the company maintained two major frameworks: open-source PyTorch for research (whose governance Meta later transferred to the PyTorch Foundation under the Linux Foundation) and Caffe2, its in-house framework for commercial use. This transition is good news not just for Meta, which will save money on maintenance and development, but also for developers using the open-source framework. Meta said it would focus on working with the PyTorch developer community, collaborating on ideas and potential projects.

    PyTorch engineers at Meta have gradually introduced the tools, pre-trained models, libraries, and datasets essential to each stage of developing AI/ML innovations. Following the 2021 updates, more than 3,000 research projects were underway on the framework, an increase over previous versions. Hopefully, tech companies will collaborate more to trim down the number of AI frameworks and create interoperable systems that promote collaboration and adoption.

    Implications of framework consolidation

    Wider implications of framework consolidation may include: 

    • Faster innovation in the AI/ML space as more companies adopt one main framework for research.
    • Consistent end-user experience across different software that uses the same underlying infrastructure, particularly for smart home and Internet of Things (IoT) devices.
    • Researchers being able to more accurately identify algorithmic bias and other bugs/issues when using one common framework.
    • More collaboration between tech firms and organizations to create open-source frameworks that anyone can access and build upon.
    • Increasing competition between big tech firms to establish the most dominant framework, which can hamper collaboration.

    Questions to comment on

    • If you work in the DL space, how has consolidating frameworks made your job easier?
    • What other benefits could a small number of frameworks that work well together provide?

    Insight references

    The following popular and institutional links were referenced for this insight: