Much has recently been written about the incredible potential of OpenAI's ChatGPT program for producing natural-language responses to human queries.
Many believe ChatGPT must be one of a kind, given how innovative and captivating it is.
AI experts disagree strongly.
In terms of underlying techniques, ChatGPT is not particularly novel, Yann LeCun, Meta's chief AI scientist, said at a small gathering of press and executives.
Despite how the public perceives it, LeCun said, "it's nothing groundbreaking. It's just really wonderfully put together and executed."
According to LeCun, other companies and research labs have built similar data-driven AI systems in the past, and it is false to assume that OpenAI is alone in doing this kind of work.
"OpenAI is not significantly more advanced than the other labs," LeCun remarked.
"It's not just Google and Meta; there are half a dozen firms that have essentially very similar technology," LeCun continued. "It's not exactly rocket science, and there isn't really a mystery to it."
LeCun pointed out that ChatGPT, and GPT-3, the OpenAI program on which it is based, are composed of multiple pieces of technology developed over many years by numerous parties.
"You have to understand that ChatGPT uses Transformer architectures that have already undergone this kind of self-supervised training," said LeCun, adding that he himself has advocated self-supervised learning since long before OpenAI existed.
The Transformer was created by Google, LeCun noted, referring to the language neural network the company released in 2017, which has served as the foundation for a wide range of language programs, including GPT-3.
According to LeCun, such language programs have been in development for decades.
LeCun credited Yoshua Bengio, director of Canada's MILA institute for AI, with building the first neural-net language model: "At the time, it was large; by today's standards, it's small." Today's language models also rely heavily on Bengio's work on the concept of attention, which Google later adopted for the Transformer.
Furthermore, OpenAI's program makes considerable use of a method known as reinforcement learning from human feedback, which enlists people to rank the machine's output in order to improve it, much as Google's PageRank ranks web pages. According to LeCun, Google's DeepMind division, not OpenAI, pioneered that approach.
LeCun, referring to ChatGPT, stated that there was a long history behind the technology.
LeCun argued that ChatGPT is more an example of good engineering than of scientific breakthroughs. He compared the software to IBM's Watson computer, which competed on Jeopardy! in 2011, and to the self-driving car developed by Sebastian Thrun's team, which won DARPA's 2005 Grand Challenge. In terms of the underlying science, Thrun's prize-winning technology "wasn't really new," according to LeCun; it was just extremely well engineered.
OpenAI has essentially done the same, he said, adding that he does not fault them for it.
During the gathering, LeCun fielded a question about OpenAI from journalist Cade Metz, who asked whether FAIR, the Meta AI team LeCun founded, would ever become as well known in the public eye for its innovations as OpenAI is.
Will Meta show us something like this? "Yes, this is what we'll see," LeCun responded — not just text generation, but also production aids such as generative art, which he thinks is going to be a big deal.
By automatically creating media that promotes a brand, Meta will be able to help small businesses promote themselves.
"There are about 12 million businesses that advertise on Facebook, and most of them are mom-and-pop stores that just don't have the resources to produce a new, properly designed ad," LeCun said. Generative art could therefore be very beneficial to them.
At another point in the discussion, LeCun addressed ChatGPT directly: "You might pose the question, why aren't there analogous systems from, say, Google and Meta?"
With a chuckle, LeCun responded, “And the explanation is that Google and Meta both have a lot to lose by putting out systems that make stuff up.”
LeCun won the 2019 Turing Award, the equivalent of the Nobel Prize in computing, along with MILA's Bengio and University of Toronto professor and Google Fellow Geoffrey Hinton. The three are credited with ushering in the deep-learning era of AI.
OpenAI is backed by Microsoft, which also has exclusive access to the startup's source code. Microsoft is gradually integrating OpenAI's programs into its many software products, including its Azure cloud service.