Jet tagging is a central and challenging task in high-energy physics, aiming to identify and classify collimated sprays of particles (jets) produced in proton–proton collisions at the Large Hadron Collider (LHC). Conventional jet tagging methods rely heavily on handcrafted observables and physics-motivated algorithms, which may not fully capture the complex, high-dimensional substructure of particle interactions within jets. Recent advances in deep learning have shown strong potential for overcoming these limitations, significantly improving performance across a wide range of particle physics applications. In this work, we investigate the use of convolutional neural networks (CNNs) for jet tagging on the JetClass benchmark dataset, which comprises 100 million training jets, 5 million validation jets, and 20 million test jets. We adopt MobileNetV3 as the backbone architecture for jet classification. Originally designed for efficient image recognition, MobileNetV3 combines depthwise separable convolutions, lightweight squeeze-and-excitation attention, and optimized network scaling to achieve high accuracy at reduced computational cost. Our results demonstrate that MobileNetV3 outperforms established baseline models, including the Particle Flow Network (PFN) and the Particle Convolutional Neural Network (P-CNN), on the JetClass dataset. These findings indicate that modern lightweight CNN architectures can effectively capture the intrinsic structure of particle jets while maintaining computational efficiency. This study highlights the promise of efficient CNN-based models for scalable and robust jet tagging at the LHC.
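To make the efficiency claim concrete, the sketch below counts the parameters of a standard convolution versus the depthwise separable factorization that MobileNetV3 is built on. This is an illustrative calculation, not code from the study; the layer sizes (a 3x3 convolution mapping 64 to 128 channels) and the helper function names are hypothetical examples chosen for clarity.

```python
# Illustrative parameter count: standard convolution vs. the depthwise
# separable convolution used throughout MobileNetV3. Layer sizes are
# hypothetical examples, not taken from the paper.

def standard_conv_params(c_in, c_out, k):
    """Parameters in a standard k x k convolution (biases omitted)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k convolution (one k x k filter per input channel)
    followed by a 1x1 pointwise convolution across channels."""
    depthwise = c_in * k * k
    pointwise = c_in * c_out
    return depthwise + pointwise

c_in, c_out, k = 64, 128, 3
std = standard_conv_params(c_in, c_out, k)        # 64*128*9   = 73728
sep = depthwise_separable_params(c_in, c_out, k)  # 576 + 8192 = 8768
print(f"standard: {std}, separable: {sep}, ratio: {std / sep:.1f}x")
```

For this example the factorized layer uses roughly 8x fewer parameters than the standard convolution, which is the source of MobileNetV3's reduced computational complexity relative to conventional CNN backbones.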