Hi all, on an ESP32 using TensorFlow Lite for Microcontrollers, I'm hitting `RuntimeError: Failed to run model at node 13 with status 1` during inference, most often with longer input sequences. I've already tried quantizing the model, simplifying the inputs, and double-checking the input formatting. What advanced debugging or memory-management techniques can help resolve this, and are there specific TensorFlow Lite settings that improve stability?
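For context, my interpreter setup looks roughly like the sketch below. This is only illustrative: the model symbol `g_model_data`, the registered op list, and the 100 KB arena size are placeholders for my actual values. I mention it because an undersized tensor arena is a common cause of node-level failures on longer sequences, and `arena_used_bytes()` is what I've been using to check headroom:

```cpp
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_log.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Placeholder: the flatbuffer exported from my training pipeline.
extern const unsigned char g_model_data[];

// Placeholder arena size; node-level failures often trace back to this
// being too small for the largest intermediate tensors.
constexpr int kArenaSize = 100 * 1024;
alignas(16) static uint8_t tensor_arena[kArenaSize];

void SetupAndRunOnce() {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the ops the model actually uses (placeholder list).
  static tflite::MicroMutableOpResolver<3> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();
  resolver.AddQuantize();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kArenaSize);

  if (interpreter.AllocateTensors() != kTfLiteOk) {
    MicroPrintf("AllocateTensors() failed; arena likely too small");
    return;
  }

  // ... fill interpreter.input(0) with a (possibly long) sequence ...

  if (interpreter.Invoke() != kTfLiteOk) {
    // This is where I see the node-13 failure on longer inputs.
    MicroPrintf("Invoke() failed");
    return;
  }

  // Report how much of the arena was actually consumed, to gauge headroom.
  MicroPrintf("Arena used: %u of %d bytes",
              static_cast<unsigned>(interpreter.arena_used_bytes()),
              kArenaSize);
}
```

Does this setup look reasonable, or is there instrumentation I should add around `Invoke()` to pin down what node 13 is running out of?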