Phone: +61 2 xxxx xxxxx

Email: info@heuristics.com.au

Heuristics Australia - Intelligence By Design

How to Enhance Power, Compute Efficiency and Mitigate Hallucinations in Generative AI

John Ypsilantis

Generative AI has become established and ubiquitous in the past few years. Growing demand for generative AI services is driving the rapid construction of datacentres housing immense computational capacity. The resulting power and cooling demands pose significant challenges regarding the construction, scaling and operation of AI datacentres.

Alternative, conventional machine learning algorithms that focus on sequence can improve the efficiency and sustainability of Generative AI. Such algorithms have the potential to reduce physical compute requirements significantly, with concomitant reductions in the power and cooling requirements for datacentres.

Essentially, Generative AI is a hardware and software system that creates sequences of tokens. These token sequences (subject to the training sets used to train the underlying LLM) may be interpreted in several ways to generate content. For example:

  • Sequences of pixels making up an image or sequence of images (animations/video)
  • Sequences of musical notes or sounds, resulting in musical scores or music itself, or
  • Sequences of words which ultimately make up documents

Very large, hierarchical and deep artificial neural networks (ANNs) underpin Generative AI and LLMs as the enabling technology.

ANNs’ strength is pattern recognition. They cannot intrinsically generate sequences; doing so requires massive scale and recurrence (feeding outputs back into the inputs).
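The recurrence described above can be sketched as an autoregressive loop. In this minimal sketch, `next_token` is a hypothetical stand-in (a fixed lookup table, not a trained network) for the forward pass that a real ANN would perform with billions of weights:

```python
# Sketch of autoregressive generation: each output token is fed back
# into the input context ("recurrence"), as described in the text.

def next_token(context):
    # Toy stand-in for an ANN's forward pass: a fixed successor lookup.
    successors = {"the": "cat", "cat": "sat", "sat": "down"}
    return successors.get(context[-1], "<eos>")

def generate(prompt, max_len=10):
    tokens = list(prompt)
    while len(tokens) < max_len:
        tok = next_token(tokens)
        if tok == "<eos>":
            break
        tokens.append(tok)  # feed the output back into the input
    return tokens

print(generate(["the"]))  # ['the', 'cat', 'sat', 'down']
```

The loop itself is trivial; the point is that a plain feed-forward ANN only supplies the `next_token` step, and sequence generation emerges solely from wrapping it in this external recurrence.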

There are classes of machine learning that intrinsically manage sequences, including:

  • Hidden Markov Models (HMMs) and
  • Sequence Graphs (SGs)

Both of these algorithms can ingest training sets that comprise sequences of tokens, and learn the characteristics of these sequences. In turn, they can generate sequences of tokens based on their prior training.
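As a simplified illustration of this learn-then-generate cycle, the sketch below uses a first-order Markov chain, a much simpler structure than either an HMM (which adds hidden states) or the Sequence Graphs described here, but one that shows the same two steps: ingest training sequences as transition counts, then generate candidate sequences from them:

```python
import random
from collections import defaultdict

# Simplified stand-in for the HMM/SG idea: learn transition counts from
# training sequences, then sample new candidate sequences from them.

def train(sequences):
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1  # observed transition a -> b
    return counts

def generate(counts, start, max_len=6, seed=0):
    rng = random.Random(seed)
    seq = [start]
    while len(seq) < max_len and counts[seq[-1]]:
        options = counts[seq[-1]]
        seq.append(rng.choices(list(options), weights=list(options.values()))[0])
    return seq

counts = train([["start", "a", "end"], ["start", "b", "end"]])
print(generate(counts, "start"))  # e.g. ['start', 'a', 'end']
```

Note how small the learned structure is relative to the training data: one counter per distinct transition, rather than a dense weight matrix.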

HMMs are often used in natural language processing. SGs are commonly used in the management and shelving of alarms in SCADA environments. In both cases, the algorithms elicit sequence characteristics efficiently and, in turn, generate candidate token sequences.

Another important feature of HMMs and SGs is their ability to quantify confidence in the output they produce, by virtue of the algorithms’ intrinsic characteristics. ANNs, on the other hand, cannot provide a confidence metric for their output. This is why Generative AI is prone to hallucination: if a generative AI is asked to assess confidence in its output, the answer is almost always greatly optimistic, even when it has hallucinated or produced completely erroneous results.
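One way to see why sequence-based models yield a confidence metric naturally: when transitions carry explicit counts, any candidate sequence can be scored by the product of its conditional transition probabilities. The sketch below uses a first-order transition-count model as a simplified stand-in for the HMM/SG formulations described in the text:

```python
from collections import defaultdict

# With explicit transition counts, a candidate sequence gets a concrete
# probability, usable as a confidence score on the model's own output.

def train(sequences):
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return counts

def confidence(counts, seq):
    """Product of conditional transition probabilities along `seq`."""
    p = 1.0
    for a, b in zip(seq, seq[1:]):
        total = sum(counts[a].values())
        p *= (counts[a].get(b, 0) / total) if total else 0.0
    return p

counts = train([["the", "cat", "sat"], ["the", "dog", "sat"]])
print(confidence(counts, ["the", "cat", "sat"]))  # 0.5 -- observed, moderately likely
print(confidence(counts, ["the", "sat", "cat"]))  # 0.0 -- never observed
```

A feed-forward ANN exposes no comparable quantity: its output activations are not grounded in observed frequencies, which is exactly the gap this paragraph describes.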

Finally, both HMMs and SGs are intrinsically unsupervised learners, which is a core requirement for machine learning in LLM and Generative AI contexts, although their performance can be further enhanced by supervised learning.

In comparing the two algorithms, SGs are the better choice. SGs do not require globally computed metrics, unlike HMMs, which require the Baum-Welch algorithm to be applied across the entire data structure to induce the hidden model. SGs intrinsically capture multiple sequences with associated confidence metrics using only local transactions and computations, and so learn sequence information extremely efficiently.
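The locality argument can be made concrete with a sketch. In the graph-of-counts stand-in below (again a simplification of the SGs described here), ingesting a new training sequence touches only the edges along that sequence, so the cost is proportional to the sequence length, with no global re-estimation pass such as Baum-Welch over the whole structure:

```python
from collections import defaultdict

# Each new observation is a purely local update: only the edges on the
# observed path are touched; no global re-estimation is needed.

graph = defaultdict(lambda: defaultdict(int))

def observe(seq):
    for a, b in zip(seq, seq[1:]):
        graph[a][b] += 1  # local update to a single edge

# Illustrative SCADA-style alarm sequences (hypothetical names).
observe(["alarm_A", "alarm_B", "alarm_C"])
observe(["alarm_A", "alarm_B", "alarm_D"])

print(dict(graph["alarm_B"]))  # {'alarm_C': 1, 'alarm_D': 1}
```

By contrast, each Baum-Welch iteration for an HMM performs forward-backward passes over every training sequence and re-estimates every model parameter, so incremental learning is far more expensive.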

In summary, specific advantages of conventional sequence-specific machine learning include:

  • The ability to assess the confidence of their output
  • The ability to work equally well on vector and standard hardware
  • Compared to ANNs, smaller and more efficient data structures describing learned sequences
  • Much lower memory and compute requirements than very large ANNs, and in turn much lower power and cooling demands and significantly lower construction and operating costs for datacentres

In conclusion, conventional machine learning can be a viable alternative base technology for Generative AI, one that addresses several of the key issues and constraints that characterise ANN-based Generative AI.



© Copyright 2026 Heuristics Australia Pty Ltd. ABN 53 079 977 076