In intensive, high-density recirculating aquaculture systems, real-time detection of shrimp vitality and estimation of time of death are critical technical challenges for ensuring production efficiency and product quality. Decaying dead shrimp rapidly pollute the water, so timely death-time determination is essential to pinpoint pollution sources, shorten the time dead shrimp remain in the water, and mitigate the risk of mass mortality. This study presents a deep learning-based method for determining shrimp death state and predicting time of death. Building on an improved YOLOv11 object detection algorithm, we construct a multi-task detection network. Beyond accurately classifying shrimp into three vitality states (Living, Half-Dead, and Dead), the network incorporates keypoint detection to capture spatial variations of the shrimp head, body, and tail; postural angles characterizing body curl are then computed from these keypoints using the law of cosines. A polynomial regression model quantifies the relationship between postural angle and time of death. Experimental results on a dataset of 2800 image samples show that the model achieves a detection accuracy (F1-score) of 96%, and death-time prediction attains a mean absolute error of 6.3 minutes. This method not only overcomes a limitation of traditional visual inspection, which can determine only whether a shrimp is alive, but also enables retrospective death-time estimation through posture analysis. It offers a novel technical approach for intelligent aquaculture monitoring and quality assessment, contributing to reduced production risk and enhanced commercial value of shrimp products.
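The geometric core of the method (keypoint triplet → law-of-cosines postural angle → polynomial regression to death time) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the keypoint coordinates, the calibration pairs, and the polynomial degree (here 2) are all assumed for demonstration.

```python
import math
import numpy as np

def postural_angle(head, body, tail):
    """Body-curl angle (degrees) at the mid-body keypoint via the law of cosines."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    a = dist(body, head)  # body -> head segment
    b = dist(body, tail)  # body -> tail segment
    c = dist(head, tail)  # head -> tail chord
    # Law of cosines: c^2 = a^2 + b^2 - 2ab*cos(theta), solved for theta at the body joint.
    cos_theta = (a * a + b * b - c * c) / (2 * a * b)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # guard against float rounding
    return math.degrees(math.acos(cos_theta))

# A fully extended shrimp yields ~180 degrees; curling after death reduces the angle.
angle = postural_angle((0.0, 0.0), (1.0, 0.0), (1.8, 0.6))

# Hypothetical (angle, minutes-since-death) calibration pairs, invented for illustration;
# the study fits a polynomial regression to measured data, not these values.
angles = np.array([170.0, 150.0, 120.0, 95.0, 80.0])
minutes = np.array([5.0, 15.0, 35.0, 60.0, 80.0])
coeffs = np.polyfit(angles, minutes, deg=2)  # assumed quadratic relationship
predict = np.poly1d(coeffs)
est_minutes = float(predict(angle))  # retrospective death-time estimate from posture
```

In practice the three keypoints would come from the pose head of the detection network, and the regression would be fitted on time-stamped observations of dead shrimp; the sketch only shows how the angle feature and the regression connect.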