Abstract:
Existing research has shown that hidden features lie between the global and local features of point clouds, and that mining and exploiting these hidden features can enhance point cloud representations. However, existing work has not analyzed or utilized hidden features in depth. To further exploit them, we propose an Attention-based Hidden Feature Utilization (AHU) module consisting of two sub-modules. The first, based on a channel attention mechanism, strengthens inter-channel dependencies and thereby increases the saliency of hidden features; the second, based on a cross-attention mechanism, projects the learned hidden features back onto the original local features, establishing long-range dependencies between them and promoting information fusion, which improves the module's generalization ability. This paper extends the theory of hidden features, and experimental results demonstrate that the AHU module can be integrated into existing state-of-the-art networks to significantly improve their performance.
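To make the two sub-modules concrete, the following is a minimal NumPy sketch of the mechanisms the abstract describes: a squeeze-and-excitation-style channel attention that reweights hidden-feature channels, followed by a cross-attention that projects the gated hidden features back onto the local features with a residual fusion. All function names, weight shapes, and dimensions (`N`, `M`, `C`, `R`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(feat, w1, w2):
    # Squeeze-and-excitation-style channel gating (assumed form).
    # feat: (M, C) hidden features; returns channel-reweighted (M, C).
    squeeze = feat.mean(axis=0)                                       # (C,) global channel descriptor
    gate = 1.0 / (1.0 + np.exp(-(np.maximum(squeeze @ w1, 0) @ w2)))  # (C,) sigmoid gate
    return feat * gate                                                # emphasize informative channels

def cross_attention(local, hidden, wq, wk, wv):
    # Project hidden features back onto local features (assumed form).
    # local: (N, C) queries; hidden: (M, C) keys/values.
    q, k, v = local @ wq, hidden @ wk, hidden @ wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)  # (N, M) attention weights
    return local + scores @ v                                  # residual information fusion

# Hypothetical sizes: N local points, M hidden features, C channels, R reduction dim.
N, M, C, R = 128, 32, 64, 16
local = rng.standard_normal((N, C))
hidden = rng.standard_normal((M, C))
w1, w2 = rng.standard_normal((C, R)), rng.standard_normal((R, C))
wq, wk, wv = (rng.standard_normal((C, C)) for _ in range(3))

hidden_gated = channel_attention(hidden, w1, w2)   # sub-module 1
fused = cross_attention(local, hidden_gated, wq, wk, wv)  # sub-module 2
print(fused.shape)  # (128, 64): fused features keep the local-feature shape
```

The residual connection in `cross_attention` keeps the original local features intact while adding hidden-feature context, so the output can drop into an existing backbone without changing its feature dimensions.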