Recent work on edge and cloud computing for IoT applications shows a clear shift toward more adaptive and efficient scheduling techniques. Researchers increasingly use Transformer-enhanced Deep Reinforcement Learning (DRL) to handle scheduling in dynamic environments, aiming to cut response times, energy consumption, and cost while improving overall system efficiency. There is also growing interest in coupling green energy sources, such as solar power, with vehicular edge computing so that task scheduling maximizes revenue; this improves sustainability and keeps systems operating through periods of low energy availability. In parallel, embedded-system designs for amateur settings are being simplified, for example lightweight solutions tailored to student racing competitions. Together, these developments push the boundaries of what is achievable in resource-constrained, dynamic computing environments.
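To make the RL-based scheduling idea concrete, the sketch below shows a minimal tabular learner that decides whether to run a task at the edge or offload it to the cloud. This is an illustrative stand-in, not the Transformer-enhanced distributed DRL method the surveyed work describes: the latency model, speeds, task sizes, and the one-step (contextual-bandit) formulation are all assumptions made here for demonstration.

```python
import random

# Toy offloading model (illustrative numbers, not from any surveyed paper):
EDGE_SPEED, CLOUD_SPEED, NET_DELAY = 2.0, 10.0, 5.0
SIZES = [2, 10, 40]          # task-size "states"
ACTIONS = ["edge", "cloud"]

def latency(size, action):
    """Edge pays compute time only; cloud adds a fixed network delay."""
    if action == "edge":
        return size / EDGE_SPEED
    return NET_DELAY + size / CLOUD_SPEED

def train(episodes=5000, alpha=0.1, eps=0.2, seed=0):
    """Epsilon-greedy tabular learning; reward is negative latency."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in SIZES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(SIZES)
        if rng.random() < eps:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        # one-step episode: update toward the observed reward
        q[(s, a)] += alpha * (-latency(s, a) - q[(s, a)])
    return q

def policy(q, size):
    return max(ACTIONS, key=lambda a: q[(size, a)])

q = train()
print(policy(q, 2), policy(q, 40))   # small task stays at the edge, large task offloads
```

Under these toy numbers the learned policy keeps small tasks on the fast-start edge node and offloads large tasks to the higher-throughput cloud, which is the qualitative trade-off the DRL schedulers exploit at much larger scale.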
Noteworthy papers include one introducing a Transformer-enhanced distributed DRL scheduling technique that reduces costs by up to 60%, and another presenting a neural-network-enhanced column generation approach for parallel machine scheduling that achieves substantial computational-time savings and scales to larger instances.
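Column generation alternates between a restricted master LP over a small pool of columns (here, feasible single-machine job sets) and a pricing subproblem that searches for a new column with negative reduced cost. The sketch below shows only the pricing step under assumptions made for illustration: a set-covering formulation where each column costs 1, integer processing times, and dual values supplied by the master LP. It is not the neural-network-enhanced method from the cited paper, whose contribution is learning to guide this step.

```python
def price_column(proc_times, duals, capacity):
    """Pricing subproblem for a set-covering formulation of parallel
    machine scheduling: find the single-machine job set with total
    processing time <= capacity that maximises the sum of dual values.
    If that sum exceeds 1, the column's reduced cost (1 - dual sum)
    is negative and it would be added to the restricted master problem.
    Solved as a 0/1 knapsack by dynamic programming."""
    n = len(proc_times)
    best = [0.0] * (capacity + 1)              # best[c] = max dual sum at load c
    choice = [[False] * n for _ in range(capacity + 1)]
    for j in range(n):
        p, d = proc_times[j], duals[j]
        for c in range(capacity, p - 1, -1):   # descending: each job used once
            if best[c - p] + d > best[c]:
                best[c] = best[c - p] + d
                choice[c] = choice[c - p][:]
                choice[c][j] = True
    c_star = max(range(capacity + 1), key=lambda c: best[c])
    jobs = [j for j in range(n) if choice[c_star][j]]
    return jobs, 1.0 - best[c_star]            # (column, reduced cost)

# Three jobs, machine capacity 8: jobs 0 and 1 fit together and their
# dual sum 1.1 > 1, so the column prices out with negative reduced cost.
jobs, rc = price_column([3, 5, 4], [0.6, 0.5, 0.4], capacity=8)
print(jobs, rc)
```

In a full implementation this routine would loop with an LP solver for the restricted master until no column prices out; the computational-time savings reported in the paper come from accelerating exactly this loop.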