Hey everyone! I’m fairly new to embedded development and have hit a puzzling issue I hope someone can shed some light on.
I’m working on a project using the nRF52840 with embassy-rs, and I’m interfacing with an SD card over SPI. There’s a GPIO pin used to detect if the card is present, and once it’s inserted, I want to wait ~500ms before interacting with the card.
I got this working, but here's the strange part:
This only works if I have RTT attached. As soon as I run the code without attaching RTT before setting up the SPI bus, all I get from the bus is 0x00. If I simply remove the delay, I get the expected 0xFF and can reset the card into SPI mode just fine, even without RTT attached.
I’d love to understand what’s going on here. Has anyone experienced something similar, or does anyone have ideas about what might be causing it? A simplified sketch of the relevant code is below; I can share more of it if that helps.
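Roughly, the flow looks like this. Note this is a simplified sketch rather than my exact code, assuming a recent embassy-nrf: the pin assignments, card-detect polarity, SPI frequency, and the defmt/panic setup are all placeholders.

```rust
#![no_std]
#![no_main]

use embassy_executor::Spawner;
use embassy_nrf::gpio::{Input, Level, Output, OutputDrive, Pull};
use embassy_nrf::{bind_interrupts, peripherals, spim};
use embassy_time::{Duration, Timer};
use {defmt_rtt as _, panic_probe as _};

bind_interrupts!(struct Irqs {
    SPIM3 => spim::InterruptHandler<peripherals::SPI3>;
});

#[embassy_executor::main]
async fn main(_spawner: Spawner) {
    let p = embassy_nrf::init(Default::default());

    // Card-detect input (placeholder pin); the socket pulls it low when a card is inserted.
    let mut card_detect = Input::new(p.P0_11, Pull::Up);
    card_detect.wait_for_low().await;

    // Give the card time to power up before touching the bus.
    Timer::after(Duration::from_millis(500)).await;

    // SPI at a low clock for the initial SD handshake (placeholder pins: SCK, MISO, MOSI).
    let mut config = spim::Config::default();
    config.frequency = spim::Frequency::K250;
    let mut spi = spim::Spim::new(p.SPI3, Irqs, p.P0_13, p.P0_15, p.P0_14, config);

    // Keep CS high and clock out dummy 0xFF bytes; the card should answer with 0xFF.
    let _cs = Output::new(p.P0_17, Level::High, OutputDrive::Standard);
    let tx = [0xFFu8; 10];
    let mut rx = [0u8; 10];
    spi.transfer(&mut rx, &tx).await.unwrap();
    // Without RTT attached and with the delay present, rx comes back as all 0x00.
}
```

The Timer line is the delay I described; removing just that line is what changes the behaviour when RTT is not attached.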
Hi @xuxuck. Your SD card needs time to power up after insertion, and RTT debugging adds this time. The 500ms delay works because it gives the card enough time to initialize properly; when you remove the delay, you’re trying to talk to the card too soon.
Yes, this is the main reason I added the delay in the first place. Unexpectedly, though, without RTT it only works if there is NO delay. As soon as I introduce a delay there, say 100ms, 500ms or 5000ms, I only get 0x00 from the bus. With RTT attached, the delay does not matter.