How artificial intelligence is redefining computers

  Fall 2021 is the season for pumpkins, walnut pie, and new phones. At this time of year, Apple, Samsung, Google, and other manufacturers release their new models on schedule. In the early years these launches still came as a surprise, but as they became a fixture of the consumer tech calendar, that effect faded. Behind the marketing, however, there are still things that deserve our attention.
  Google’s latest Pixel 6 is the first phone to ship with a separate AI-specific chip alongside its main processor. The chips used in iPhones released over the past two years include what Apple calls a “neural engine,” also dedicated to AI. Both are better suited to the kinds of computation needed to train and run machine-learning models on the devices in our hands, such as the AI that enhances the camera. Almost without our noticing, AI has become part of our daily lives.
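  To make that concrete, here is a minimal sketch of what running a trained model “on the device in our hands” looks like in code, using TensorFlow Lite, a runtime commonly used for on-device inference. The model file name is a placeholder, not a file any phone actually ships with; on a phone, the runtime can hand work like this off to the dedicated AI chip through a hardware delegate.

```python
# Minimal sketch: running an already-trained model on-device with TensorFlow Lite.
# "image_classifier.tflite" is a hypothetical model file used only for illustration.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="image_classifier.tflite")
interpreter.allocate_tensors()

input_info = interpreter.get_input_details()[0]
output_info = interpreter.get_output_details()[0]

# A stand-in for a camera frame, shaped and typed to whatever the model expects.
frame = np.random.random_sample(tuple(input_info["shape"])).astype(input_info["dtype"])

interpreter.set_tensor(input_info["index"], frame)
interpreter.invoke()  # on a phone, a delegate can route this work to the AI chip
scores = interpreter.get_tensor(output_info["index"])
print("predicted class:", int(scores.argmax()))
```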
  What does all this mean? For the past forty or fifty years, computers have not fundamentally changed. They have gotten smaller and faster, but they have remained little boxes of processors that execute instructions written by humans. AI changes that in at least three ways: how computers are made, how they are programmed, and how they are used. Eventually, it may change what computers are for.
  “The core of computing is shifting from crunching numbers to making decisions,” says Pradeep Dubey, director of Intel’s Parallel Computing Lab. Or, as Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), puts it, AI is freeing computers from their little boxes.
Speed is not enough

  The first change concerns how computers, and the chips that control them, are made. Traditionally, advances in computing came from making each individual computation faster. For decades, Moore’s Law delivered those speed-ups in chip manufacturing on a predictable cycle, and the world benefited from it.
  But the deep learning models that power today’s AI applications need a different approach: they require vast numbers of simultaneous, less precise computations. That calls for a new kind of chip, one that moves data around as fast as possible and makes sure it is available when and where it is needed. When deep learning took off about a decade ago, there were already chips that were very good at this kind of work: graphics processing units, or GPUs, designed to redraw an entire screen’s worth of pixels dozens of times a second.
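  The difference between “one precise calculation at a time” and “many simple calculations at once” is easy to see in code. The sketch below is a toy illustration in plain NumPy, nothing GPU-specific: it computes one layer of a neural network element by element in a Python loop, and then as a single bulk matrix multiplication in half precision, the style of work GPUs and their successors are built to do in parallel.

```python
# Toy contrast: computing one neural-network layer element by element
# versus as a single bulk, low-precision operation.
import time
import numpy as np

inputs = np.random.rand(64, 256).astype(np.float16)    # a small batch of activations
weights = np.random.rand(256, 256).astype(np.float16)  # one layer's weights

# Style 1: one output value at a time, the way a sequential program works.
start = time.perf_counter()
out_loop = np.empty((64, 256), dtype=np.float16)
for i in range(64):
    for j in range(256):
        out_loop[i, j] = inputs[i] @ weights[:, j]
loop_time = time.perf_counter() - start

# Style 2: the whole layer as one bulk matrix multiply in reduced precision --
# the kind of work parallel accelerators are designed for.
start = time.perf_counter()
out_bulk = inputs @ weights
bulk_time = time.perf_counter() - start

print(f"element by element: {loop_time:.4f}s   one bulk operation: {bulk_time:.4f}s")
print("same result (up to float16 rounding):", np.allclose(out_loop, out_bulk, rtol=1e-2))
```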
  Chipmakers such as Intel, Arm, and Nvidia, long known for conventional processors and graphics hardware, are now building AI-specific hardware. Google and Facebook are also forcing their way into the industry, and a race for AI dominance in hardware is under way.
  Google’s Pixel 6, for example, carries a new mobile version of the company’s tensor processing unit, or TPU. Unlike traditional chips, which are geared toward ultrafast, high-precision operations, TPUs are designed for the high-volume, low-precision operations that neural networks require. Google has used such chips internally since 2015 to process images and handle natural-language search queries, and DeepMind uses TPUs to train its AIs.
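  What “high-volume, low-precision” means in practice can be sketched with simple 8-bit quantization, one common trick that AI accelerators rely on. The example below is a generic illustration, not a description of the TPU’s internals: weights and activations are squeezed into 8-bit integers, the bulk arithmetic happens on those small numbers, and the result is scaled back to floating point afterward.

```python
# A generic 8-bit quantization sketch: do the bulk arithmetic in low precision,
# then rescale. Illustrative only -- not how any particular chip implements it.
import numpy as np

def quantize(x):
    """Map float values onto int8 with a single scale factor."""
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

activations = np.random.randn(32, 256).astype(np.float32)
weights = np.random.randn(256, 128).astype(np.float32)

q_act, act_scale = quantize(activations)
q_w, w_scale = quantize(weights)

# The heavy lifting: an integer matrix multiply (accumulate in int32 to avoid overflow).
int_result = q_act.astype(np.int32) @ q_w.astype(np.int32)

# Rescale the low-precision result back to floating point and compare.
approx = int_result.astype(np.float32) * act_scale * w_scale
exact = activations @ weights

rel_error = np.abs(approx - exact).mean() / np.abs(exact).mean()
print(f"mean relative error from 8-bit arithmetic: {rel_error:.3%}")
```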
  Over the past two years, Google has made TPUs available to other companies, and these chips, along with similar ones developed by rivals, are becoming standard in data centers around the world.
  AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm to work out the layout of a new TPU. The AI eventually came up with a strange design that no human would have thought of, but it worked. This kind of AI could one day produce better, more efficient chips.
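  Google’s actual system trained a neural-network policy with reinforcement learning; the toy sketch below swaps that for plain random local search, just to convey the shape of the problem: propose a layout, score it (here by total wire length between connected blocks), and keep a change only if the score improves. The block names and connections are made up.

```python
# Toy floorplanning by random local search -- a deliberately simple stand-in
# for the learned reinforcement-learning policy Google used.
import random

random.seed(0)
blocks = ["cpu", "cache", "dram_ctrl", "dma", "io"]            # hypothetical blocks
nets = [("cpu", "cache"), ("cpu", "dram_ctrl"), ("cache", "dram_ctrl"),
        ("dma", "dram_ctrl"), ("io", "dma")]                   # hypothetical connections
GRID = 8

def wire_length(placement):
    """Score a layout: total Manhattan distance between connected blocks."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in nets)

# Start from a random legal placement (one block per grid cell).
cells = random.sample([(x, y) for x in range(GRID) for y in range(GRID)], len(blocks))
placement = dict(zip(blocks, cells))
best = wire_length(placement)

for _ in range(5000):
    block = random.choice(blocks)
    candidate = (random.randrange(GRID), random.randrange(GRID))
    if candidate in placement.values():
        continue                      # keep placements non-overlapping
    old = placement[block]
    placement[block] = candidate
    score = wire_length(placement)
    if score < best:
        best = score                  # keep improving moves
    else:
        placement[block] = old        # revert worsening moves

print("final wire length:", best)
print("layout:", placement)
```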
Show, don’t tell

  The second change concerns how computers are told what to do. “For the past 40 years, we’ve been programming computers; for the next 40 years, we’ll be training them,” said Chris Bishop, director of Microsoft Research in the UK.
  In the past, to get a computer to do something like recognize speech or identify objects in a photo, programmers first had to write rules for it to follow.
  But with machine learning, programmers no longer write the rules. Instead, they create neural networks that learn those rules on their own. It’s a fundamentally different way of thinking.
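  A tiny, invented example makes the contrast concrete. The first function below encodes a rule a programmer wrote by hand; the second lets a model infer its own rule from labeled examples, here using scikit-learn. The messages and labels are made up for illustration.

```python
# Hand-written rule vs. a rule learned from examples (toy spam filter).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

def rule_based_is_spam(message):
    """The old way: a programmer decides the rule in advance."""
    return "free" in message.lower() or "winner" in message.lower()

# The machine-learning way: show the computer labeled examples instead.
messages = ["You are a winner, claim your free prize now",
            "Free gift card, click here",
            "Lunch at noon tomorrow?",
            "Here are the meeting notes from today"]
labels = [1, 1, 0, 0]   # 1 = spam, 0 = not spam (made-up training data)

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(messages)
model = LogisticRegression().fit(features, labels)

new_message = ["Claim your prize before the meeting"]
print("hand-written rule says spam?", rule_based_is_spam(new_message[0]))
print("learned model says spam?",
      bool(model.predict(vectorizer.transform(new_message))[0]))
```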
  Examples are everywhere: speech and image recognition are now standard features on smartphones. Others have made headlines, such as AlphaZero teaching itself to play Go better than any human, or AlphaFold working out how proteins fold, a biological problem that had defied researchers for decades.
  For Bishop, the next big breakthrough will be molecular simulation: training computers to manipulate the properties of matter, which could lead to world-changing advances in energy use, food production, manufacturing, and medicine.
Computer knows best

  The third change concerns how computers are used. For decades, getting a computer to do a task meant typing instructions, or at least pressing a button.
  But now, machines can interact with us without keyboards or screens, and almost anything can be turned into a computer. Most household items, from toothbrushes to light switches to doorbells, already come in smart versions. As these products multiply, though, we want to spend less effort telling them what to do; they should be able to work out what we need on their own.
  In Dubey’s eyes, this shift from crunching numbers to making decisions defines the new era of computing.
  Rus wants computers to support us both cognitively and physically. She envisions machines that give us information when we need it and lend a hand when we need help. “My favorite movie scene as a kid was Disney’s ‘The Sorcerer’s Apprentice,’” Rus said. “You know, Mickey Mouse summons a broom to help him clean up. He needed magic to do it; we don’t need magic anymore.”
