DeepMind AI discovers faster algorithms for fundamental mathematical problems. Algorithms have helped mathematicians perform basic operations for thousands of years. The ancient Egyptians, for example, devised a method for multiplying two numbers without a multiplication table, and the Greek mathematician Euclid described an algorithm for computing the greatest common divisor that is still in use today.
In their paper, published today in Nature, DeepMind researchers describe AlphaTensor, the first artificial intelligence (AI) system that can discover novel, efficient, and provably correct algorithms for fundamental tasks such as matrix multiplication. The work sheds light on a 50-year-old open question: how quickly can two matrices be multiplied?
This work is a step toward DeepMind's goal of using AI to advance science and solve the most fundamental problems. AlphaTensor builds on AlphaZero, an agent that has outperformed humans at board games such as chess, Go, and shogi. This paper marks AlphaZero's first move from playing games to tackling unsolved mathematical problems.
Matrix multiplication
Matrix multiplication is one of the simplest operations in algebra and is taught in most high school math classes. Yet outside the classroom, this humble operation underpins much of today's digital world and appears everywhere in modern computing.
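For concreteness, here is a minimal sketch of the standard "schoolbook" algorithm taught in class: each entry of the result is the dot product of a row of the first matrix with a column of the second, so multiplying two n-by-n matrices takes n³ scalar multiplications. (This example is illustrative and not from the paper.)

```python
# Standard matrix multiplication: C[i][j] is the dot product of
# row i of A with column j of B. For n x n inputs this performs
# n**3 scalar multiplications.
def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    assert all(len(row) == m for row in A), "inner dimensions must match"
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i][j] += A[i][k] * B[k][j]
    return C

# 2x2 example: 8 scalar multiplications with the standard algorithm.
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Counting those scalar multiplications is what makes "how fast can matrices be multiplied?" a precise mathematical question.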
This operation is used to process images on smartphones, recognize speech commands, render graphics for computer games, run weather simulations, compress data and video for sharing on the internet, and much more. Companies worldwide invest considerable time and money in computer hardware that multiplies matrices quickly and efficiently. So even small improvements to matrix multiplication can have a significant effect.
Mathematicians long believed the standard matrix multiplication algorithm was the most efficient possible. But in 1969, the German mathematician Volker Strassen shocked the field by showing that better algorithms exist.
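Strassen's insight was that two 2-by-2 matrices can be multiplied with only 7 scalar multiplications instead of the usual 8, and applying the trick recursively to large matrices beats the standard n³ algorithm. The sketch below shows the classic 2-by-2 scheme (a well-known construction, reproduced here for illustration):

```python
# Strassen's 1969 scheme: multiply two 2x2 matrices with 7 scalar
# multiplications (m1..m7) instead of the standard 8. Applied
# recursively to blocks, this yields an asymptotically faster algorithm.
def strassen_2x2(A, B):
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

AlphaTensor searches for schemes of exactly this kind: combinations of sums and products that compute the correct result with fewer multiplications.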
In their paper, the researchers explored how modern AI techniques could automate the discovery of new matrix multiplication algorithms. Building on human intuition, AlphaTensor found algorithms that outperform the state of the art for many matrix sizes. Its algorithms beat human-designed ones, a significant step forward in the field of algorithmic discovery.
Conclusion
AlphaTensor is trained from scratch to find matrix multiplication algorithms that outperform those designed by humans or by other computational methods. Although AlphaTensor beats known algorithms, the researchers note one limitation: the set of possible factor entries F must be defined in advance. Restricting this set keeps the search space tractable, but it could exclude efficient algorithms whose factors lie outside it.
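To make the notion of "factor entries" concrete, here is a hedged sketch (not taken from the paper) of how a matrix multiplication algorithm can be encoded as three factor matrices U, V, W, which is the kind of representation AlphaTensor searches over. Strassen's rank-7 scheme for 2-by-2 matrices is used as the example; note that every factor entry lies in the small predefined set {-1, 0, 1}:

```python
# Strassen's 2x2 scheme written as a rank-7 factorization: each of the
# 7 products m_i multiplies one linear combination of A's entries
# (rows of U) with one of B's entries (rows of V); W recombines the
# products into the output. All entries lie in the set F = {-1, 0, 1}.
U = [[1, 0, 0, 1], [0, 0, 1, 1], [1, 0, 0, 0], [0, 0, 0, 1],
     [1, 1, 0, 0], [-1, 0, 1, 0], [0, 1, 0, -1]]
V = [[1, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, -1], [-1, 0, 1, 0],
     [0, 0, 0, 1], [1, 1, 0, 0], [0, 0, 1, 1]]
W = [[1, 0, 0, 1, -1, 0, 1], [0, 0, 1, 0, 1, 0, 0],
     [0, 1, 0, 1, 0, 0, 0], [1, -1, 1, 0, 0, 1, 0]]

def dot(row, vec):
    return sum(r * v for r, v in zip(row, vec))

def multiply_via_factors(A, B):
    a = [A[0][0], A[0][1], A[1][0], A[1][1]]  # flatten row-major
    b = [B[0][0], B[0][1], B[1][0], B[1][1]]
    m = [dot(u, a) * dot(v, b) for u, v in zip(U, V)]  # 7 multiplications
    c = [dot(w, m) for w in W]
    return [[c[0], c[1]], [c[2], c[3]]]

print(multiply_via_factors([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Fixing F in advance, as here, bounds the search; algorithms needing factor entries outside F are unreachable, which is the limitation the researchers describe.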
Adapting AlphaTensor to search over F itself is an exciting direction for future research. Among AlphaTensor's most important strengths are its ability to optimize complex stochastic and non-differentiable rewards (from tensor rank to practical efficiency on specific hardware) and to discover algorithms for custom operations over a wide range of spaces, such as finite fields. The researchers believe this flexibility will make it easier for others to use AlphaTensor to design algorithms that optimize the metrics they care about.
The researchers also note that their method can be applied to related problems, such as other notions of rank and NP-hard matrix factorization problems. By using deep reinforcement learning (DRL) to attack a core NP-hard computational problem in mathematics (the computation of tensor ranks), AlphaTensor shows that DRL can tackle hard mathematical problems and could help mathematicians make new discoveries.
4th Edition of International Conference on Mathematics and Optimization Methods
Website Link: https://maths-conferences.sciencefather.com/
Award Nomination: https://x-i.me/XU6E
Instagram: https://www.instagram.com/maths98574/
Twitter: https://twitter.com/AnisaAn63544725
Pinterest: https://in.pinterest.com/maxconference20022/
#maths #numericals #algebra #analysis #mathematics #number #complex #graphics #graphs