Recent advances in Knowledge Graphs (KGs) and their embeddings have substantially improved graph representation learning. One notable line of work examines how Knowledge Graph Embedding Models (KGEMs) interact with the structure of the KGs they are trained on, addressing questions of structural bias and the determinants of predictive performance. Another is the Hyperbolic Hypergraph Neural Network (H2GNN), which operates in hyperbolic space to better capture hierarchical data and the complex, beyond-pairwise relations found in hypergraphs. Link prediction in directed graphs has also advanced, with multi-class and multi-task formulations, which jointly model edge existence and direction, outperforming traditional binary approaches. Lorentzian Residual Neural Networks (LResNet) offer a stable and efficient way to add residual connections to hyperbolic neural networks, a long-standing obstacle because ordinary vector addition does not keep points on the manifold. Finally, extensions of the TWIG model aim to streamline hyperparameter selection for KGEMs by predicting how a configuration will perform from the structure of the graph itself, potentially enabling zero-shot recommendation of good configurations for unseen graphs. Together, these developments broaden what is achievable in graph-based machine learning, for both academic research and practical applications. The sketches that follow illustrate three of these ideas.
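The multi-class framing of directed link prediction can be made concrete with a small sketch. The idea is to classify each unordered node pair into four classes (no edge, forward edge, backward edge, or both) rather than scoring directions independently. The random graph, the stand-in embeddings, and the logistic-regression classifier below are placeholder assumptions for illustration, not the architecture of the work summarized above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 4-class labelling for an unordered node pair {u, v}:
# 0 = no edge, 1 = u->v only, 2 = v->u only, 3 = edges in both directions.
def pair_label(adj, u, v):
    return int(adj[u, v]) + 2 * int(adj[v, u])

rng = np.random.default_rng(0)
n_nodes, dim = 50, 16
adj = rng.random((n_nodes, n_nodes)) < 0.1   # toy random directed graph
emb = rng.normal(size=(n_nodes, dim))        # stand-in for pretrained node embeddings

pairs = [(u, v) for u in range(n_nodes) for v in range(u + 1, n_nodes)]
X = np.array([np.concatenate([emb[u], emb[v]]) for u, v in pairs])
y = np.array([pair_label(adj, u, v) for u, v in pairs])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict_proba(X[:3]).round(3))     # class probabilities per pair
```

Casting the problem this way lets one classifier share evidence across directions, which is the intuition behind the reported gains over separate binary predictors.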
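The obstacle LResNet addresses can also be sketched. In the Lorentz model, points satisfy ⟨x, x⟩_L = −1/K for curvature −K, and a Euclidean residual sum x + f(x) falls off that hyperboloid. A minimal version of the Lorentzian fix, shown below, takes a weighted sum and rescales it back onto the manifold; this is only a toy under the stated convention, and the published LResNet derives specific weight choices with stability guarantees that this sketch does not reproduce.

```python
import torch

def lorentz_inner(x, y):
    # Lorentzian inner product <x, y>_L = -x_0 * y_0 + sum_i x_i * y_i
    p = x * y
    return p[..., 1:].sum(dim=-1, keepdim=True) - p[..., :1]

def lift(x_space, curv=1.0):
    # Place spatial coordinates on the hyperboloid <x, x>_L = -1/curv
    # by solving for the time coordinate x_0.
    x0 = torch.sqrt(1.0 / curv + (x_space ** 2).sum(dim=-1, keepdim=True))
    return torch.cat([x0, x_space], dim=-1)

def lorentz_residual(x, fx, w_x=1.0, w_f=1.0, curv=1.0):
    # Weighted sum of two hyperboloid points, rescaled back onto the
    # manifold; a positive combination of future-pointing timelike
    # vectors stays timelike, so the rescaling is always well defined.
    z = w_x * x + w_f * fx
    norm = torch.sqrt(torch.clamp(-lorentz_inner(z, z), min=1e-8))
    return z / (norm * curv ** 0.5)

x, fx = lift(torch.randn(4, 8)), lift(torch.randn(4, 8))
z = lorentz_residual(x, fx)
print(lorentz_inner(z, z).squeeze())  # ~ -1: the residual stays on the manifold
```

Because the operation is a single rescaled addition rather than a pair of exponential and logarithmic maps, it is cheap and numerically well behaved, which is the efficiency argument made for this family of residuals.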
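Finally, the TWIG-style idea of predicting KGEM performance from graph structure can be caricatured as a learned scorer. The sketch below is a loose, hypothetical rendering of that idea: the class name, feature dimensions, and MLP architecture are all assumptions for illustration and differ from TWIG's published model.

```python
import torch
import torch.nn as nn

# Hypothetical regressor in the spirit of TWIG: map global graph-structure
# features plus an encoded hyperparameter configuration to a predicted
# evaluation metric (e.g. MRR), so candidate configs for a new KG can be
# ranked without training a KGEM for each one.
class HyperparamScorer(nn.Module):
    def __init__(self, n_graph_feats, n_hparam_feats, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_graph_feats + n_hparam_feats, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),  # MRR is bounded in [0, 1]
        )

    def forward(self, graph_feats, hparam_feats):
        return self.net(torch.cat([graph_feats, hparam_feats], dim=-1))

scorer = HyperparamScorer(n_graph_feats=10, n_hparam_feats=6)
graph = torch.randn(1, 10).expand(32, 10)   # one KG, 32 candidate configs
configs = torch.rand(32, 6)
best = scorer(graph, configs).argmax()      # rank configs by predicted MRR
```

Once such a scorer is trained on past (graph, configuration, metric) triples, ranking candidate configurations for a new graph costs only forward passes, which is what makes zero-shot hyperparameter recommendation plausible.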