Hardware design automation is shifting toward large language models (LLMs) and novel methodologies that improve design efficiency, accuracy, and adaptability. Recent work targets the core challenges of automated hardware design, including hallucinations in LLM-generated code and the need for efficient design space exploration. Frameworks and techniques such as agentic LLMs, prompt engineering, and retrieval-augmented generation are being explored to extend LLM capabilities in hardware design. There is also growing emphasis on infrastructure for multi-level hardware design and for power, performance, and area (PPA) estimation. Noteworthy papers include:
- VeriMind, which proposes an agentic LLM framework for automated Verilog generation together with a novel evaluation metric, achieving improvements of up to 8.3% on pass@k and 8.1% on pass@ARC.
- HDLCoRe, which presents a training-free framework that mitigates hallucinations in LLM-generated HDL, achieving superior performance on the RTLLM2.0 benchmark.
- MLDSE, which introduces a novel infrastructure for domain-specific design space exploration of multi-level hardware, enabling three-tier DSE spanning architecture, hardware parameters, and mapping.
- RocketPPA, which develops an ultra-fast LLM-based PPA estimator at code-level abstraction, achieving significant improvements in power, delay, and area estimation accuracy.
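For context on the pass@k figures reported above (pass@ARC is VeriMind's own metric), pass@k is the standard functional-correctness estimator for generated code. A minimal sketch of the widely used unbiased estimator, assuming n generations per problem of which c pass the testbench:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: the probability that at least one
    of k samples, drawn without replacement from n generations
    (c of which are correct), passes verification."""
    if n - c < k:
        # Fewer than k incorrect samples exist, so any draw of k
        # must include at least one correct sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Example: 10 generations, 3 correct, evaluated at k = 1
score = pass_at_k(10, 3, 1)
```

For Verilog generation, "correct" typically means the candidate compiles and passes a simulation testbench; the benchmark then averages pass@k over all problems.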