Technology

Jonathan De Vita: Why Is Memory More Powerful Than Time for Algorithms?

By Andreas McGowan | December 4, 2025
[Image: Abstract representation of memory and time concepts in algorithmic technology or AI systems]

Jonathan De Vita is a computer scientist who specialised in AI and coding during his studies at Lancaster University. This article looks at algorithms and a recent study suggesting that a little memory can outweigh a vast amount of time.

An algorithm is a finite, well-defined procedure for solving a problem or performing a computation. Essentially, an algorithm is a precise set of instructions specifying step-by-step actions, carried out by hardware- or software-based routines.

In computer science, computer programming and mathematics, the term ‘algorithm’ is generally used to describe a process that solves a recurrent problem. Algorithms are widely used across all areas of IT today, playing a major role in automated systems and serving as specifications for data processing tasks.

Typically starting from an initial input, with instructions describing a specific computation, algorithms are used for a wide variety of purposes, from sorting datasets to more complicated endeavours such as recommending content to social media users. They work by following a set of rules or instructions to solve a problem or complete a task, and may be expressed in programming languages, natural language, flowcharts, control tables or pseudocode.
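To make this concrete, here is a classic example of such a precise, step-by-step procedure: binary search, which locates a value in a sorted list. This is an illustrative sketch added for this article, not code from the study discussed below.

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # inspect the middle element
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1                  # discard the lower half
        else:
            hi = mid - 1                  # discard the upper half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # prints 3
```

Each step is unambiguous, the input and output are clearly specified, and the same procedure could equally be written as a flowchart or as pseudocode.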

With algorithms, and computation as a whole, time and memory are the two most fundamental resources. Every algorithm requires time to run along with memory, also known as space, to store data while the algorithm is running. Until very recently, it was believed that algorithms for completing certain tasks required an amount of space roughly proportional to their runtime. However, a recent study has turned this widely held belief on its head: research from MIT suggests that any algorithm, irrespective of its purpose, can be transformed into a form that demands far less space, on the order of the square root of its runtime.
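The trade-off between these two resources can be felt even in a toy setting. The sketch below, which is only an illustration and has nothing to do with the MIT construction itself, computes Fibonacci numbers two ways: one version caches every intermediate result, so its memory use grows with the amount of work done, while the other keeps just two values in memory at any moment.

```python
from functools import lru_cache

# Version 1: cache every intermediate result. Memory used grows
# with the number of distinct subproblems solved.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

# Version 2: keep only the last two values. Constant memory,
# same answer.
def fib_iter(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_memo(30), fib_iter(30))   # both print 832040
```

The surprise in the recent research is that something like the second trick, getting by with far less space, turns out to be possible for *every* algorithm, not just for problems with convenient structure.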

In 2024, computer scientist Ryan Williams made a startling discovery about the relationship between memory and time in computing. His research provided compelling evidence that memory was much more powerful than he and his peers had believed up until that point, suggesting that a small amount of memory was as helpful as a lot of time in all conceivable computations. Since the notion seemed improbable, Williams assumed he had made a mistake and went back over his work to try to identify any errors. However, despite close scrutiny and hours of poring over his results, he could not find a single flaw.

Williams, a theoretical computer scientist at MIT, told Wired: ‘I just thought I was losing my mind.’ Faced with such compelling evidence, however, he began to wonder for the first time whether memory really might be far more powerful than previously assumed.

Over the ensuing months, Williams fleshed out the details, scrutinising every component and soliciting feedback from trusted contemporaries. In February 2025, he posted his paper online to widespread acclaim. Avi Wigderson, a theoretical computer scientist at the Institute for Advanced Study in Princeton, New Jersey, described Williams’ research as ‘amazing’ and ‘beautiful’ in a congratulatory email headed with the subject line ‘You blew my mind.’

Algorithm designers have long studied space-time trade-offs for specific tasks such as sorting data, where even a modest space saving is seen as a significant advance. Williams’ research centred on the question of solving problems with extremely limited space, drawing on the pioneering work of complexity theorist Stephen Cook and his tree evaluation problem. The research has established a quantitative gap between the power of memory and the power of time, although, as Williams would be the first to admit, much work remains to be done in pinning down their precise relationship.
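To give a flavour of the problem Williams built on, here is a much-simplified toy version of tree evaluation: a binary tree whose leaves hold small values and whose internal nodes each apply a function to their children’s results. This sketch is only illustrative; Cook’s formal definition fixes the tree height and the size of the values, and the open question concerns how little working memory suffices to compute the root.

```python
# Toy tree evaluation: leaves carry values, internal nodes carry
# two-argument functions applied to their children's results.
def evaluate(node):
    """Naive recursive evaluation; working memory grows with tree depth."""
    if "value" in node:                    # leaf node
        return node["value"]
    left = evaluate(node["left"])
    right = evaluate(node["right"])
    return node["fn"](left, right)

tree = {
    "fn": lambda x, y: (x + y) % 7,        # arbitrary small-range function
    "left": {
        "fn": lambda x, y: (x * y) % 7,
        "left": {"value": 3},
        "right": {"value": 4},
    },
    "right": {"value": 6},
}

print(evaluate(tree))   # ((3*4) % 7 + 6) % 7 = 4
```

The naive evaluator above needs memory proportional to the tree’s depth to hold intermediate results; the surprising results in this line of work show that cleverer strategies can get by with substantially less.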

Andreas McGowan

Andreas McGowan is a tech news writer at ZXQ. He has been interviewed about his opinions on technology and the way it interacts with life as we know it, as well as how he approaches producing news articles for ZXQ.
