
Jonathan De Vita: Why Is Memory More Powerful Than Time for Algorithms?

By Andreas McGowan | December 4, 2025 | Updated: December 4, 2025 | 4 min read
[Image: Abstract representation of memory and time concepts in algorithmic technology or AI systems]

Jonathan De Vita is a computer scientist who specialised in AI and coding as part of his studies at Lancaster University. This article looks at algorithms and a recent study suggesting that a little memory can outweigh a vast amount of time.

An algorithm is a procedure designed to solve a problem or perform a computation. Essentially, an algorithm is a precise set of instructions for carrying out specified step-by-step actions, whether through hardware- or software-based routines.

In computer science, computer programming and mathematics, the term ‘algorithm’ is generally used to describe a process that solves a recurrent problem. Algorithms are widely used across all areas of IT today, playing a major role in automated systems and serving as specifications for data processing.

Typically starting from an initial input and a set of instructions describing a specific computation, algorithms are used for a wide variety of purposes, from sorting datasets to more complicated tasks such as recommending content to users on social media. They work by following a set of rules or instructions to solve a problem or complete a task, and they may be expressed in programming languages, natural language, flowcharts, control tables or pseudocode.
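
To make the idea of step-by-step instructions concrete, here is a minimal sketch in Python of one of the simplest sorting algorithms, insertion sort. It is a generic textbook illustration rather than code from the research discussed in this article.

# Insertion sort: a simple illustration of an algorithm expressed as code.
# Generic textbook example, not code from the study discussed in this article.
def insertion_sort(values):
    """Return a new list containing the items of 'values' in ascending order."""
    result = list(values)  # copy the input so the original list is untouched
    for i in range(1, len(result)):
        current = result[i]
        j = i - 1
        # Shift larger items one slot to the right until the correct
        # position for 'current' is found, then place it there.
        while j >= 0 and result[j] > current:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = current
    return result

print(insertion_sort([5, 2, 9, 1, 7]))  # prints [1, 2, 5, 7, 9]

Each pass follows the same fixed rule, which is exactly what makes the procedure an algorithm rather than an ad hoc calculation.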

In algorithms, and in computation as a whole, time and memory are the two most fundamental resources. Every algorithm requires time to run and memory, also known as space, to store data while it is running. Until very recently, it was believed that algorithms for certain tasks required an amount of space roughly proportional to their runtime. However, a recent study has turned this widely held belief on its head, with research from MIT suggesting that it may be possible to transform any algorithm, irrespective of its purpose, into a form that demands far less space.
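
To illustrate what a space-time trade-off looks like in practice, the sketch below compares two ways of checking a list for duplicate entries: one uses almost no extra memory but examines every pair of items, while the other spends extra memory on a set and finishes in roughly linear time. This is a simple illustrative example only, not the transformation described in the MIT research.

# Illustrative only: two ways to detect duplicates, trading memory for time.
# Neither function is the construction from the research discussed above.
def has_duplicates_low_memory(items):
    """Uses essentially no extra space, but compares every pair: O(n^2) time."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_more_memory(items):
    """Spends O(n) extra space on a set to run in roughly O(n) time."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

data = [3, 8, 2, 8, 5]
print(has_duplicates_low_memory(data))   # True
print(has_duplicates_more_memory(data))  # True

The surprising message of the new research is that, in a precise theoretical sense, this kind of trade-off can be pushed much further than simple examples like this one suggest.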

In 2024, computer scientist Ryan Williams made a startling discovery about the relationship between memory and time in computing. His research provided compelling evidence that memory is far more powerful than he and his peers had believed up to that point, suggesting that a small amount of memory is as helpful as a lot of time in all conceivable computations. Because the notion seemed improbable, Williams assumed he had made a mistake and went back through his work to hunt for errors. Despite close scrutiny and hours spent poring over his results, he could not find a single flaw.

Ryan Williams, a theoretical computer scientist at MIT, explained in a Wired interview: ‘I just thought I was losing my mind.’ Faced with such compelling evidence, however, he began to wonder for the first time whether memory really might be far more powerful than previously assumed.

Over the ensuing months, Williams fleshed out the details, scrutinising every component and soliciting feedback from trusted contemporaries. In February 2025, he posted his paper online to widespread acclaim. Avi Wigderson, a theoretical computer scientist at the Institute for Advanced Study in Princeton, New Jersey, described Williams’ research as ‘amazing’ and ‘beautiful’ in a congratulatory email with the subject line ‘You blew my mind.’

Algorithm designers have long studied space-time trade-offs for specific tasks such as sorting data, where even a small saving in space is seen as a significant advance. Williams’ research centred on solving problems with extremely limited space, drawing on the pioneering work of complexity theorist Stephen Cook and his tree evaluation problem. The result establishes a quantitative gap between the power of memory and the power of time, although, as Williams would be the first to admit, much work remains to be done to pin down their precise relationship.

Andreas McGowan

Andreas McGowan is a tech news writer at ZXQ. He has been interviewed about his opinions on technology and the way it interacts with life as we know it, as well as how he approaches producing news articles for ZXQ.
