AI Winters: Why Did the Enthusiasm Collapse Twice?

    1. What Is an ‘AI Winter’?

An artificial intelligence winter is a period of declining interest and funding. It follows a phase of high expectations whose promises fail to materialize (ISA, 2024; Maduranga, 2024). During these cycles, projects lose public and private support; as a result, the field enters a pause and is sometimes redefined.

Imaginative representation of an AI Winter

    2. First Winter (1974–1980)

    2.1 Expectations vs. Reality

After the superficial early advances of the 1960s, expectations ran high. However, machine translation failed to deliver useful results, and the critical 1966 ALPAC report led to reduced funding, especially in the U.S., which dampened enthusiasm (Cdteliot, 2024).

    2.2 Criticism and Cutbacks

In 1973, the Lighthill Report, commissioned by the British government, questioned the viability of AI research. Its publication led to funding cuts at UK universities. Around the same time, DARPA halted many projects, citing a lack of significant progress (Cdteliot, 2024).

    2.3 Academic Consequences

    The result was an exodus of researchers. Many turned to more stable disciplines. The environment became conservative; research continued in small circles, and AI lost its public visibility (Smartnetacademy, 2020).

Imaginative representation of the failure of machine translation: AI translates the word “love” in Chinese as “chicken”

3. Brief Spring (1980s) and Second Winter (1987 to 1993–94)

    3.1 Rise of Expert Systems

    In the 1980s, hope reemerged with expert systems. These systems mimicked specialized decision-making, such as in medicine or finance. Companies began funding these projects, sparking a new wave of investment (Krdzic, 2024). This led to a brief spring.
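The expert systems of that era were typically rule-based: a knowledge base of if–then rules plus an inference engine that chains them over known facts. The sketch below illustrates forward chaining in Python; the medical-style rules and fact names are invented purely for illustration, not taken from any historical system.

```python
def forward_chain(facts, rules):
    """Apply if-then rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire a rule when all its conditions hold and its
            # conclusion is not yet a known fact.
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical knowledge base: each rule is (conditions, conclusion).
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "high_risk_patient"}, "recommend_doctor_visit"),
]

result = forward_chain({"fever", "cough", "high_risk_patient"}, rules)
print(sorted(result))
```

This same structure also shows the brittleness discussed below: an input outside the rule base (say, a symptom no rule mentions) simply derives nothing, which is one reason these systems struggled with unusual cases.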

    3.2 Failure of Specialized Hardware and Limitations of Expert Systems

In 1987, the LISP machine industry collapsed: personal computers had become capable of running Lisp software without expensive specialized hardware. As a result, much of the sector crumbled within months, triggering another drastic funding cut (Wikipedia Eng, 2025).

The limitations of expert systems themselves began to show in the early 1990s. Although some proved successful, they were expensive to maintain, their knowledge bases were hard to update, and they struggled to produce coherent outputs when given unusual inputs (ISA, 2024; Wikipedia Eng, 2025).

    3.3 Failed Projects

    Japan launched its ambitious “Fifth Generation” project with substantial funding, but it failed to meet expectations. DARPA initiatives like the Strategic Computing Initiative also faced budget cuts due to modest results (Wikipedia Eng, 2025).

Historic Photograph of a LISP Machine

    4. Common Causes of the Slowdown

    • Excessive expectations. The goals were so ambitious that actual progress seemed insignificant compared to the initial hype.
    • Technical limitations. Available hardware and data were insufficient to support complex models.
    • Commercial disconnect. The tools offered were too expensive and failed to deliver the expected return on investment.
Old low-capacity computers

    5. Lessons and the Resilience Cycle

    5.1 Learning from Within

    Each AI winter led to methodological adjustments. Practical research was prioritized, and scientific rigor was reinforced. Many breakthroughs emerged precisely during less glamorous times.

    5.2 Importance of Hardware and Data

    The AI resurgence in the 2000s coincided with advancements in hardware (GPUs) and the massive availability of data. These factors enabled the development of efficient algorithms that powered deep learning.

    5.3 Managing Expectations

    Today, it is recognized that slowdowns are natural. Managing the hype allows progress to continue steadily. Modern AI is more responsible and less impulsive (Urban, 2025).

GPU GeForce FX 5900

    6. Are There Signs of a New AI Winter?

Experts warn of a possible future cooldown. Although the current boom is strong, it faces several challenges (Zara, 2024; Zulhusni, 2024):

    • Risk of overpromising in generative AI.
    • Regulations around privacy and ethics that could slow expansion.
    • Unrealistic expectations for applications like autonomous vehicles.
Spring or Winter Ahead?

    7. Conclusion

    The AI winters were periods of correction following overblown expectations. And although they temporarily halted progress, they also helped consolidate methods, approaches, and technologies. In other words, they offered a chance to recalibrate the field toward more realistic goals.

    Learning from these episodes is key to preventing AI from cooling again. The current boom is built on stronger foundations, but caution remains essential—mainly because there’s always a risk of setting goals that are too ambitious to meet.

    However, as long as progress continues steadily toward achievable objectives and as long as existing methods and technologies are solidified, the chances of a new AI winter diminish significantly.

Winter Landscape

    Share this article if you were surprised by how many times AI has reinvented itself.